Build a Raspberry Pi Robot: Part 2

So here we are at part 2 of building a Raspberry Pi based humanoid robot. For those who have not read it yet, I am attempting to create my own humanoid robot using a Raspberry Pi Zero (see https://renzedevries.wordpress.com/2016/02/28/building-a-raspberry-pi-humanoid-robot-in-java-part-1/). In this part of the series I will dive into the challenges of getting the robot to walk, and I will also make a start on some basic obstacle detection.

In the previous part I mostly focused on the basic electronics setup and wiring everything up, and I also tested the concept of reading sensor data. I won't cover those parts again here; the materials needed are still the same as mentioned in part 1.

What about walking?

So what is the fuss about walking? It seems simple, right? We do it every day, so why can't a robot do the same? Well, as it turns out, walking is one of the most complex things we humans do. Without thinking about it we are actually constantly falling forward in a controlled manner, and our brain just works this out. We have known this since childhood, and our brain learned it by trial and error; you must have seen videos of babies stumbling around :), well, that was the trial and error bit.

So where does this leave a robot? When it comes to getting a humanoid robot to walk, you want to achieve dynamic walking. This means that during movement the robot does not always have a stable center of gravity and is, in essence, out of balance while walking. There is a lot of research going on in this area, most of it trying to accomplish what we humans can already do.

Implementing Walking

So in this post I will zoom in a bit on how I have implemented a simple walking mechanism for my humanoid robot. The approach I have taken is actually quite simple: most humanoid robot kits or controllers out there (Nao, Robotis, etc.) have some form of motion editing capability. This is a piece of software where you can record movements of your robot: over a defined span of time you can record the precise state of your joints and servos.

The robot I am building is based on a Robotis Bioloid Premium kit, and the kit actually comes with motion software called RoboPlus which can be used to define motions for the servos / robot. What is also quite handy is that RoboPlus ships with a pre-defined motion file for humanoid walking, which I can re-use for my own robot, since it is based on their kit, with some light tweaks.

Java Motion Execution

The motion software generates a motion file containing all the motions and the positions of the servos over time. In order to get the robot to walk I have to write some software that can read this motion file and execute the moves defined in it at the correct points in time.

My goal is to build controller software in Java that runs on the Raspberry Pi, so it only made sense to write the converter software in Java as well. The motion file that comes out of RoboPlus is actually a plain text file that is relatively simple to read: it defines motions (stand up, walk forward, sit down, etc.), and per motion it defines keyframes (moments in time) that specify the positions of all the servos.

For people who also want to be able to use RoboPlus motion files, you can check out the converter code I have written here: https://github.com/renarj/robo-sdk/blob/master/robo-core/src/main/java/com/oberasoftware/robo/core/robomotion/RoboPlusMotionConverter.java
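To give an idea of what the converter produces, here is a minimal sketch of a motion model: a motion with a name and an ordered list of keyframes, each keyframe holding a duration plus one target position per servo. The class and field names are illustrative, not the actual robo-sdk API.

```java
import java.util.List;
import java.util.Map;

public class MotionModel {
    // A keyframe: how long the move takes, and where each servo should end up.
    public record KeyFrame(int durationMs, Map<String, Integer> servoPositions) {}

    // A motion: a named, ordered sequence of keyframes (e.g. "walk forward").
    public record Motion(String name, List<KeyFrame> keyFrames) {}

    public static void main(String[] args) {
        KeyFrame step = new KeyFrame(30, Map.of("leftKnee", 512, "leftAnkle", 480));
        Motion walk = new Motion("walk forward", List.of(step));
        System.out.println(walk.name() + " has " + walk.keyFrames().size() + " keyframe(s)");
    }
}
```

Executing a motion then comes down to walking through the keyframes in order and moving every servo to its target within the keyframe's duration.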

Challenges

Now that the motion file can be read, the next challenge was actually running the motions. In the past I had already written the Java code to control the servos, so it's just a matter of executing the motions then, right? Well, I faced significant challenges due to the timing of the servo motions; walking motions are incredibly time sensitive. During walking there is a period where the robot is in essence out of balance and only its forward momentum is keeping it going, so if the timing is incorrect it can fall over, which happened a lot 🙂

A single keyframe typically executes in 30-50 milliseconds; in the case of the walking motion they are all below 30 ms. A normal modern computer would not think twice about such timings; in fact, hooking the servos up to my Mac made the robot walk very smoothly. On a Raspberry Pi Zero, however, this is more challenging, especially when running Java, which is not the quickest on a Pi.

Calculating a keyframe move itself is also relatively challenging: the distance a single servo has to travel is not the same as for the other servos, yet the motion does need to be synchronized between, for example, the knee and the foot. Because the distance and the time are the known variables here, the only variable left to play with is speed. For each servo, a calculation needs to be done to determine how fast it needs to move to reach its new target position.
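The calculation above can be sketched as follows: with distance and time known, the speed each servo needs is simply distance divided by the keyframe duration. The method name is illustrative, and positions are in raw servo units.

```java
public class ServoSpeed {
    // Speed (position units per second) needed to reach targetPos
    // from currentPos within timeMs milliseconds.
    public static double speedFor(int currentPos, int targetPos, int timeMs) {
        int distance = Math.abs(targetPos - currentPos);
        return distance / (timeMs / 1000.0);
    }

    public static void main(String[] args) {
        // In the same 50 ms keyframe the knee travels 100 units and the
        // ankle only 40, so the knee must move 2.5x faster to arrive
        // at the same moment and keep the leg synchronized.
        System.out.println(speedFor(412, 512, 50)); // knee
        System.out.println(speedFor(480, 520, 50)); // ankle
    }
}
```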

So here you can see some videos of the first attempt, which is very wobbly:

However, after quite some tweaking and optimising I managed to get the timings near perfect:

 

A short summary of the optimisations I had to make: it turns out the Raspberry Pi Zero is quite bad at doing a lot of floating point calculations, and those calculations were needed to determine the speed at which a servo should move from its current position to a target position. Because walking is in essence a repetitive motion, you can simply cache all the calculations and movements, and even pre-compute them before executing the motion.
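The caching idea can be sketched like this: each (current, target, duration) combination is computed once and then looked up, so the repeated floating point divisions disappear from the walking loop. The class name is illustrative.

```java
import java.util.HashMap;
import java.util.Map;

public class SpeedCache {
    private final Map<String, Double> cache = new HashMap<>();

    // Returns the cached speed for this move, computing it only on first use.
    public double speedFor(int currentPos, int targetPos, int timeMs) {
        String key = currentPos + ":" + targetPos + ":" + timeMs;
        return cache.computeIfAbsent(key,
                k -> Math.abs(targetPos - currentPos) / (timeMs / 1000.0));
    }

    public static void main(String[] args) {
        SpeedCache cache = new SpeedCache();
        cache.speedFor(412, 512, 50);                    // computed once up front...
        System.out.println(cache.speedFor(412, 512, 50)); // ...then served from the cache
    }
}
```

Pre-computing then simply means running every keyframe of a motion through the cache once before the motion starts.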

Another optimisation was to avoid any multi-threading: the original controller design was based on asynchronous, event-driven methods, but this turns out to be quite costly on a single-core Pi Zero. Simply inlining all the servo calls fixed this and ensured the robot walks smoothly.
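The inlined approach can be sketched as a plain loop that writes every servo target sequentially on the calling thread, with no thread hand-off or scheduler latency in between. `ServoDriver` here is a hypothetical minimal interface, not the actual robo-sdk one.

```java
import java.util.Map;

public class InlineExecution {
    // Hypothetical minimal servo interface for this sketch.
    interface ServoDriver {
        void setTargetPosition(String servoId, int position);
    }

    // Executes one keyframe by writing all servo targets inline,
    // on the calling thread, instead of dispatching async events.
    public static void executeKeyFrame(ServoDriver driver, Map<String, Integer> positions) {
        positions.forEach(driver::setTargetPosition);
    }

    public static void main(String[] args) {
        executeKeyFrame((id, pos) -> System.out.println(id + " -> " + pos),
                Map.of("leftKnee", 512));
    }
}
```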

Obstacle detection

This is turning out to be quite a long post, so let me wrap it up with some basic obstacle detection. In the previous post you saw that I got some distance sensors wired up that return a voltage based on the detected distance. I have written a bit of code in the controller framework that converts this into a scaled distance, and wired this up together with the walking mechanism.
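The voltage-to-distance conversion can be sketched as below, assuming a Sharp-style analog IR sensor whose output voltage falls off roughly inversely with distance. The constant is illustrative only; a real sensor needs calibration against its datasheet curve, and this is not the actual `AnalogToDistanceConverter` from the framework.

```java
public class AnalogToDistance {
    // Calibration constant (volt * cm); assumed value for illustration.
    private static final double SCALE = 27.0;

    // Converts a measured voltage to an approximate distance in centimeters.
    public static double toCentimeters(double volts) {
        if (volts <= 0) {
            return Double.MAX_VALUE; // nothing detected within range
        }
        return SCALE / volts;
    }

    public static void main(String[] args) {
        // ~20 cm: right around the stop threshold used further below.
        System.out.println(toCentimeters(1.35));
    }
}
```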

This is how the controller code is now starting to look:


ADS1115Driver adsDriver = new ADS1115Driver();
adsDriver.init();
Robot robot = new SpringAwareRobotBuilder(context)
 .motionEngine(RoboPlusMotionEngine.class, new RoboPlusClassPathResource("/bio_prm_humanoidtypea_en.mtn"))
 .servoDriver(DynamixelServoDriver.class, ImmutableMap.<String, String>builder().put(DynamixelServoDriver.PORT, "/dev/tty.usbmodem1411").build())
 .sensor(new DistanceSensor("distance", adsDriver.getPort("A0"), new AnalogToDistanceConverter()))
 .sensor(new GyroSensor("gyro", adsDriver.getPort("A2"), adsDriver.getPort("A3"), new AnalogToPercentageConverter()))
 .build();
RobotEventHandler eventHandler = new RobotEventHandler(robot);
robot.listen(eventHandler);

And here the code that stops the robot when it detects an obstacle:

public static class RobotEventHandler implements GenericRobotEventHandler {
    private static final Logger LOG = LoggerFactory.getLogger(RobotEventHandler.class);

    private Robot robot;

    public RobotEventHandler(Robot robot) {
        this.robot = robot;
    }

    @EventSubscribe
    @EventSource("distance")
    public void receive(DistanceSensorEvent event) {
        LOG.info("Received a distance event: {}", event);

        if(event.getDistance() < 20) {
            LOG.info("Killing all tasks");
            robot.getMotionEngine().stopAllTasks();
        }
    }
}

And this is the resulting video of the robot starting to walk and stopping just before a box; it is not perfect yet:

 

I am quite pleased with the progress so far; next time I hope to tell a bit about robot interaction. I am planning to hook this robot and the Nao robot up together and let them interact with each other via the Robot SDK I am currently developing. If you are interested, please keep an eye on this GitHub repository: https://github.com/renarj/robo-sdk