Named "Carry" as it is built to carry tools, the Workshop Waifu v.1.38 is a robot designed to bring a little life to the workshop. In the mech development lifecycle, it is the first attempt to control heavy-duty mech-grade motors with a computer (a Raspberry Pi 3B+). Controlling motors with a computer instead of manually opens up quite a few possibilities.
17 April 2022
The basic frame was roughed out in PVC with motors zip-tied to the frame and connected to a tractor battery to see if it was possible for the unit to move.
18 April 2022
Testing basic emoting by displaying an image of eyes that randomly moves around and is briefly replaced by an image of eyelids to simulate blinking. The amount of emotion that can be read from just a pair of eyes is quite astounding.
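The gaze-and-blink behaviour can be sketched as a tiny state generator. This is an illustrative model only (the actual display code is not shown in the post); the blink probability and gaze range are assumptions:

```python
import random

# Each "frame" either blinks (swap in the eyelid image) or glances
# to a random offset. Constants are illustrative, not from the build.
BLINK_CHANCE = 0.05   # probability of blinking on any given frame
GAZE_RANGE = 40       # max pixel offset for a random glance

def next_eye_state(rng=random):
    """Return (x_offset, y_offset, blinking) for the next frame."""
    if rng.random() < BLINK_CHANCE:
        return (0, 0, True)                    # show the eyelid image briefly
    dx = rng.randint(-GAZE_RANGE, GAZE_RANGE)  # random glance left/right
    dy = rng.randint(-GAZE_RANGE, GAZE_RANGE)  # random glance up/down
    return (dx, dy, False)
```

A display loop would call this on a timer and redraw the eye image at the returned offset, briefly showing the eyelid image whenever `blinking` is true.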
21 April 2022
Using a Bluetooth controller/mouse and code to translate mouse coordinates into motor motion. The brief high-pitched whirring accompanying each change in drive direction is actually a servo physically flipping a three-position switch: motor forward, backward, and neutral.
It is much simpler to control a high-current motor via a switch/servo instead of multiple relays, and more visually interesting too!
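The switch-flipping logic boils down to mapping one control axis onto three servo positions. A minimal sketch, assuming normalized stick input and made-up servo angles and deadband (the real values are not given in the post):

```python
# Hypothetical servo angles for the 3-position motor switch (degrees).
SERVO_FORWARD, SERVO_NEUTRAL, SERVO_BACKWARD = 150, 90, 30
DEADBAND = 0.2  # stick travel around centre treated as "neutral"

def servo_angle(y_axis):
    """Map a y-axis value in [-1.0, 1.0] to a switch position."""
    if y_axis > DEADBAND:
        return SERVO_FORWARD    # push the switch to 'forward'
    if y_axis < -DEADBAND:
        return SERVO_BACKWARD   # push the switch to 'backward'
    return SERVO_NEUTRAL        # centre position, motor off
```

Because the servo only ever moves between three fixed angles, the high-current path stays entirely in the mechanical switch, which is what makes this simpler than a bank of relays.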
24 April 2022
Making the Workshop Waifu much taller opens up many possibilities, including more tool storage space and more personal interaction with people.
27 April 2022
Adding more neck/head articulation and finishing the head surface makes it look much more like a robot.
12 May 2022
By adding a variable resistor (potentiometer) to the abdominal joint, the computer can detect the position of the linear actuator and consequently move it to a desired position. This opens up countless possibilities for future mech developments, such as pre-programmed sequences. Eventually, a single button could be used to throw a punch or perform any other physically possible emotive action.
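Position feedback like this is usually closed with a simple seek loop: read the potentiometer through an ADC, compare to the target, and drive until within tolerance. A sketch under assumed values (10-bit ADC counts, an invented tolerance, and stand-in read/drive callables, since the actual wiring and interfaces aren't detailed in the post):

```python
TOLERANCE = 8  # ADC counts considered "close enough" (assumed)

def actuator_command(current, target):
    """Return 'extend', 'retract', or 'stop' to move toward target."""
    error = target - current
    if error > TOLERANCE:
        return "extend"
    if error < -TOLERANCE:
        return "retract"
    return "stop"

def seek(read_adc, drive, target, max_steps=1000):
    """Poll the pot and drive the actuator until within tolerance."""
    for _ in range(max_steps):
        cmd = actuator_command(read_adc(), target)
        drive(cmd)
        if cmd == "stop":
            return True   # reached the target position
    return False          # gave up (e.g. jammed actuator)
```

A pre-programmed sequence is then just a list of target positions fed to `seek` one after another, which is what makes the one-button punch idea feasible.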
Further aesthetic paneling is added with EVA foam/floor mat. The material is easy to work with and very friendly for human interaction.
26 May 2022
While testing the face tracking for the head, the eye graphics went a little wonky. Running multiple loops in Python at a usable speed takes some work to get just right.
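One common way to keep a slow loop (face tracking) from stalling a fast one (eye animation) is to run each in its own thread and share only the latest result through a size-1 queue. This is a general pattern, not the project's actual code:

```python
import queue
import threading

# A size-1 queue holds only the newest face position; stale data is dropped.
face_positions = queue.Queue(maxsize=1)

def tracker(frames):
    """Stand-in for the face-tracking loop: publish the newest position."""
    for pos in frames:
        if face_positions.full():
            face_positions.get_nowait()  # discard the stale position
        face_positions.put(pos)

def animator(out, n_frames):
    """Stand-in for the eye loop: draw using the latest known position."""
    last = (0, 0)
    for _ in range(n_frames):
        try:
            last = face_positions.get(timeout=0.1)
        except queue.Empty:
            pass                          # no new data; keep the old target
        out.append(last)                  # "draw" the eyes at this position
```

The animation loop never blocks for long, so the eyes keep moving smoothly even when the tracker lags behind.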
30 May 2022
The first voice command is implemented, making the robot respond to its name. Using PocketSphinx and the SpeechRecognition library, it is much easier than one would expect, and completely self-contained. No Google API or cloud voice-control APIs here!
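A name-response setup along these lines splits into two parts: offline transcription and a wake-word check on the transcript. The matching half is pure Python; the capture half is only outlined since it needs a microphone plus the SpeechRecognition and pocketsphinx packages, and the details below are assumptions rather than the build's actual code:

```python
WAKE_WORDS = ("carry",)  # the robot answers to its name

def heard_wake_word(transcript):
    """True if any wake word appears in a recognized transcript."""
    words = transcript.lower().split()
    return any(w in words for w in WAKE_WORDS)

def listen_once():
    """Capture one utterance and transcribe it fully offline.
    Requires the SpeechRecognition + pocketsphinx packages and a mic."""
    import speech_recognition as sr
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_sphinx(audio)  # offline, no cloud API
    except sr.UnknownValueError:
        return ""  # speech was unintelligible
```

The main loop would simply call `listen_once()` and react whenever `heard_wake_word()` returns true, with everything running on the Pi itself.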
05 July 2022
With all the code running at the same time and the aesthetic paneling mostly complete, the Workshop Waifu's function has been proven. The only thing that remains is a more stable method of locomotion, perhaps tank treads hidden under the skirt...