Humanoid Robot

I am somewhat obsessed with the HBO television series Westworld, in which Dolores Abernathy is one of the central characters. She is an artificially created mechanical being, also known as a ‘host’, designed to mimic a human. The manufacturing process for the hosts, as depicted in the show, appears to use an advanced version of 3D printing technology. Project Dolores is my attempt to create a 3D printed humanoid robot using standard hobby servos and electronics. All the plastic parts were designed in Fusion 360 and printed in PLA on my CR-10S printer.

My robot’s main processor is an Arduino Mini Mega 2560, a module identical to the normal Mega 2560 but in a smaller form factor. There’s a BY8301 sound module mounted on the perfboard under the Arduino for on-board recordings, and an MH-M38 Bluetooth audio receiver to stream audio from a phone, an Alexa smart-home device, or whatever. There’s also an HC-05BT Bluetooth transceiver that connects to the remote. There is a provisional connection for an ESP-01 WiFi receiver, but this has not been implemented as of this writing. Three DC-DC converters power the 23 servos (5 in each arm, 8 in the head, and 5 in the legs). Originally I was planning to have at least 5 servos in each leg, but as development proceeded it turned out they were not strong or fast enough to make this work. So the latest version has Dolores in a standing position that allows some forward and backward movement and in-place rotation.

The remote is based on an Arduino Nano processor, along with another HC-05BT Bluetooth transceiver. There are several pushbuttons and joysticks, as well as an LCD display. The power for the remote is a USB power pack mounted on the back. The remote originally had a set of different joysticks, as shown in the internal photo below, but these were changed to self-centering PlayStation-style joysticks, as shown in the exterior photo.
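The post doesn’t describe the wire protocol between the remote and the robot, but the HC-05 link is just a transparent serial pipe, so the remote has to frame its joystick and button readings somehow. Here’s a minimal sketch of one plausible framing (the packet layout, header byte, and checksum are my assumptions, not the actual protocol):

```cpp
#include <cstdint>

// Hypothetical 6-byte packet: header, x, y, z, buttons, checksum.
// x/y/z are raw 0-255 joystick readings (e.g. 10-bit ADC >> 2 on the Nano).
struct RemotePacket {
    uint8_t x, y, z, buttons;
};

const uint8_t HEADER = 0xA5;

// Serialize a packet into a 6-byte buffer with a simple XOR checksum.
void packRemote(const RemotePacket &p, uint8_t out[6]) {
    out[0] = HEADER;
    out[1] = p.x;
    out[2] = p.y;
    out[3] = p.z;
    out[4] = p.buttons;
    out[5] = out[1] ^ out[2] ^ out[3] ^ out[4];
}

// Returns true and fills 'p' only if the header and checksum match,
// so a byte corrupted in flight is dropped rather than acted on.
bool parseRemote(const uint8_t in[6], RemotePacket &p) {
    if (in[0] != HEADER) return false;
    if ((in[1] ^ in[2] ^ in[3] ^ in[4]) != in[5]) return false;
    p.x = in[1]; p.y = in[2]; p.z = in[3]; p.buttons = in[4];
    return true;
}
```

The header byte lets the receiver resynchronize after dropped bytes, which matters on a Bluetooth serial link that can hiccup.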

I started with one hand and arm, which went through a couple iterations to get to the current design. Here are a couple videos of the early arm development:



The head was the next part designed. The teeth are borrowed from another project on Thingiverse, but I designed the jaw mechanism as well as my own eye and eyelid movement mechanisms. The neck is controlled with three servos to allow tilt in all directions as well as rotation of the head. The eyes are 20mm acrylic doll eyes bought from a seller on eBay. The speaker inside the head is connected to the BY8301 sound module. A couple magnets are used as the attachment points for the face and head.

In honor of the HBO Westworld series, here’s a video of an early version of the eye and mouth mechanisms paired with an audio track from the show:


A face and head were designed to fit over and around the eye/mouth mechanism, but it still needed something more to truly be called “Dolores” – an 11-inch doll wig from Factory Direct Craft was perfect! She has a face that only a mother (or father) could love:

Along the way I found a synthetic speech processor that created files suitable for use as Dolores’s voice. Here are a couple more demo videos as things progressed:



I got creative with some videos for Christmas and Valentine’s Day:



Originally Dolores was intended to walk, but I knew it would be a difficult challenge given the limitations of hobby servos. I tried two different designs for the legs. The first one had the joints directly driven by the servos. An MPU-6050 IMU was mounted on the circuit assembly on the body with the idea of actively balancing the robot. Unfortunately, it turned out that the servos in this configuration just weren’t strong enough to support and move the robot. The leg servos are DS3218s; although rated at 20 kg-cm of torque, that was not enough, and the robot sagged under its own weight. Here is a video of the early leg development tests before I attempted to connect everything to the torso assembly:


I tried again with a second leg design that used joints connected to the servos through pushrods to give a mechanical advantage in torque. This did work somewhat better to support the robot – she definitely could stand, and her arms could raise while the hip moved back to compensate. The problem now was that the servos’ already slow response time was slowed down by the same factor that the torque was multiplied, leaving too little speed to balance the robot while it was taking a step. I attempted to augment the IMU processing with force sensors in the feet to help identify whether the robot was tending to lean forward or backward, but again, the slow servo response did not allow this idea to work.
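The torque-versus-speed trade-off here is just the lever ratio of the linkage: output torque scales up by the ratio, and output speed scales down by the same ratio. A quick sanity check, using the DS3218’s nominal spec numbers (20 kg-cm and 60° in ~0.16 s, roughly 375°/s no-load – illustrative figures from the datasheet, not measurements on the actual robot):

```cpp
// Pushrod linkage: output torque scales by the lever ratio r,
// while angular speed scales by 1/r. Joint values are nominal
// spec numbers, not measured from the robot.
struct Joint { double torqueKgCm; double speedDegPerSec; };

Joint throughLinkage(Joint servo, double ratio) {
    return { servo.torqueKgCm * ratio, servo.speedDegPerSec / ratio };
}
```

With a 2:1 linkage, a nominal DS3218 would deliver about 40 kg-cm but only ~187°/s – enough to hold the pose, but the joint now takes twice as long to make the same correction, which is exactly the balancing problem described above.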

Here is a video of the V2 legs and the robot standing and balancing while the arms are raised. The weight of the arms is more than enough to tip the balance point forward, so the hips automatically move back to compensate:
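Active balancing like this needs a clean tilt estimate from the MPU-6050. The post doesn’t show the sensor-fusion code used; a common approach is a complementary filter, which blends the gyro integral (responsive but drifting) with the accelerometer angle (noisy but drift-free). A minimal sketch, with an illustrative blend constant:

```cpp
// Complementary filter for pitch estimation.
// pitchDeg:      current filtered estimate
// accelPitchDeg: pitch computed from accelerometer axes (e.g. atan2)
// gyroRateDps:   pitch rate from the gyro, degrees per second
// dt:            loop period in seconds
double updatePitch(double pitchDeg, double accelPitchDeg,
                   double gyroRateDps, double dt, double alpha = 0.98) {
    // Mostly trust the gyro-propagated angle; slowly pull toward
    // the accelerometer angle to cancel gyro drift.
    return alpha * (pitchDeg + gyroRateDps * dt)
         + (1.0 - alpha) * accelPitchDeg;
}
```

The filtered pitch would then feed whatever correction moves the hips – the estimate converges to the accelerometer angle when the robot is still, but follows the gyro during fast motion.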


The third and current version of the legs is a much simpler design where both feet are fixed to the baseplate but allow swivel motion through bearings in the soles. The hip, thigh, and ankle servos are still connected similarly to the second version’s pushrod design, allowing the legs to tilt forward and back and swivel slightly left and right. The latest software swivels the legs to match the direction the arms are pointed and moves the hips back and forward as the arms are raised or lowered. While she can move and dance, unfortunately Dolores can’t walk.

The torso houses the processing and communications boards, the power converters, and the battery. I decided to move the battery to a housing on the baseplate to make it easier to remove it for recharging, but it still can easily be put inside the body if desired. The body has a couple spring-loaded clips that allow it to be opened and tilted forward to insert and remove the battery. However, that’s not really necessary anymore since she’s permanently attached to the baseplate.

The arm and leg control software uses an inverse kinematic (IK) model to let the user move the remote’s joysticks to manipulate a virtual point in front of the robot forward/backward, left/right and up/down. The arm joint positions are computed with the IK algorithm such that the centroid of a line connecting the two index fingers follows the virtual point as it is moved around. There are also some keep-out zone lockouts to prevent the hands from crashing into the body and head as the arms are moved. The arms have 6 modes of movement:

  • Normal: the arms move forward/backward and up/down together, but they move left/right in opposite directions – in other words, the hands move towards or away from each other in response to the x joystick motion.
  • X reverse: same as normal mode, except that the hands move left/right in the same directions – when the x joystick is moved left, both hands move left, when the x joystick is moved right, both hands move right.
  • Y reverse: same as normal mode, except that the forward/backward motion is reversed – when the y joystick is pushed, one arm moves forward while the other moves backward, and vice versa.
  • Z reverse: same as normal mode, except that the up/down motion is reversed – when the z joystick is pushed, one arm moves up while the other moves down, and vice versa.
  • XY reverse: same as normal mode, except that both the left/right and forward/backward motions are reversed.
  • YZ reverse: same as normal mode, except that both the forward/backward and up/down motions are reversed.
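The six modes boil down to a sign table: the left arm follows the joystick directly, and each mode flips the sign of one or more axes for the right arm. A sketch of that mapping (my reconstruction from the list above, not the actual source):

```cpp
// Sign applied to the right arm's x/y/z deltas relative to the left arm.
// Normal mode mirrors x, so the hands move toward or away from each other.
enum Mode { NORMAL, X_REVERSE, Y_REVERSE, Z_REVERSE, XY_REVERSE, YZ_REVERSE };

struct Signs { int x, y, z; };

Signs rightArmSigns(Mode m) {
    switch (m) {
        case NORMAL:     return { -1, +1, +1 };  // hands mirror left/right
        case X_REVERSE:  return { +1, +1, +1 };  // hands move the same way
        case Y_REVERSE:  return { -1, -1, +1 };  // fwd/back opposed
        case Z_REVERSE:  return { -1, +1, -1 };  // up/down opposed
        case XY_REVERSE: return { +1, -1, +1 };
        case YZ_REVERSE: return { -1, -1, -1 };
    }
    return { -1, +1, +1 };  // fall back to normal mode
}
```

Each frame, the joystick deltas are applied to the left arm’s IK target as-is, and multiplied by these signs for the right arm’s target.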

So by switching modes while ‘puppeteering’ the robot, a wide variety of motions can easily be produced. With a little practice it’s actually pretty intuitive.

As described earlier, there are actually two audio sources – the BY8301 sound module with pre-recorded files, and the MH-M38 Bluetooth audio receiver that can connect to whatever Bluetooth source you want. The BY8301 is connected to the speaker inside the head, and the MH-M38 is connected to the two speakers on the front of the torso. Audio signals from both modules are fed to analog inputs of the Arduino Mega 2560, where a filter algorithm performs an auto-gain function to normalize the level, then filters the signal and drives the mouth servo so the movement appears fairly well synchronized to the audio. The arms and legs are also programmed to respond automatically to peaks in the sound. The remote control joystick inputs still work while all this is going on, so it’s easy to manually augment the robot’s motions while the automation is running.
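One way to implement that auto-gain-plus-filter stage is an envelope follower with a slowly decaying peak tracker: rectify the sample, normalize it by the running peak, smooth the result, and map it to a jaw angle. This is a sketch of the general technique, not the actual firmware – all the constants (decay rate, smoothing factor, 40° jaw range) are illustrative:

```cpp
#include <algorithm>
#include <cmath>

// Envelope follower with automatic gain for lip-sync.
class MouthTracker {
    double peak = 0.01;   // running peak for auto-gain (non-zero to avoid /0)
    double level = 0.0;   // smoothed, normalized level in [0, 1]
public:
    // 'sample' is the audio input centered on zero (e.g. ADC reading - midpoint).
    // Returns a jaw opening angle in degrees.
    int update(double sample) {
        double mag = std::fabs(sample);          // rectify
        peak = std::max(mag, peak * 0.999);      // slow-decay peak tracker
        double norm = std::min(mag / peak, 1.0); // auto-gain normalization
        level += 0.2 * (norm - level);           // one-pole low-pass smoothing
        return (int)(level * 40.0);              // map to 0-40 deg jaw opening
    }
};
```

Because the peak tracker adapts to whatever level the source plays at, quiet Bluetooth streams and loud on-board files both drive the full jaw range, and the low-pass keeps the servo from chattering on every audio cycle.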

The following is a demo video that showcases one of the onboard files – what you’re hearing is a recording of synthetic speech processed through an auto-tune audio algorithm on my PC and then loaded onto the BY8301 sound module as an mp3 file. This video and the three that follow show Dolores filmed with my iPhone against a green screen. The videos of Dolores are overlaid on the Westworld backgrounds using the Chroma Key filter in the OpenShot video editor.


Here are several videos that demonstrate the Amazon Alexa integration. For these demonstrations, the robot is effectively a Bluetooth speaker connected to the Echo Dot with the audio processed in real time to move the arms and legs according to the sounds:



