Building a Raspberry Pi Robot Car, Part 2

In the last post we talked about the electronics; in this post I will talk a bit about the 3D design and printing of the components. I recently acquired an Ultimaker 3D printer, and after quite a bit of experimenting I am now able to design my own components for the robot car. In this post I will walk through the design of the robot car.

Designing the robot

The robot itself is going to consist of 4 main components:
* Casing containing the main electronics (Raspberry Pi, power distribution, etc.)
* Casing containing the LiPo battery that allows easy battery replacement
* Frame that supports both the battery and electronics casing
* Wheel / suspension mechanism to hold the wheels

Note: The printer has a maximum print size of roughly 20 × 20 × 20 cm, which is the main reason the power casing, electronics casing and frame are separate parts.

The design software

For the 3D design I started out with TinkerCAD, which is a free online 3D editor. However, I quickly ran into problems once the dimensions became more complex. I then switched to Autodesk Fusion 360, which is much better suited for designing technical components; as a hobbyist it is possible to get a free one-year license.

Wheel / Suspension

The suspension is spring based, which allows some flex in the wheel assembly. The suspension needs to hold a servo, as the wheel itself is attached to the servo. For this I have designed a small bracket suited to my Dynamixel servos.

Next there is one beam that has the spring attached to it and two static beams that connect to the servo holder. The static beams ensure linear motion of the servo holder, while the spring damps the motion. This looks as follows:

I will dedicate a separate post to the wheel design at some point, as the wheels have caused me a lot of headaches. For now I will use some standard wheels that fit onto the servos, but ultimately these will become mecanum wheels.

Designing the frame

The beams used for the suspension are actually part of the base frame. There are going to be four wheels, which means four beams in the frame. To create sufficient surface area for the battery and electronics casings I have connected the beams into a longer frame using connecting pieces: an end piece for each end of the frame and a middle piece that ties the beams together. This looks as follows:

Each beam has a length of 12 cm, the middle piece is 4 cm and the end pieces are 2 cm each. With two beams in line on each side this gives a total length of 12 + 12 + 4 + 2 + 2 = 32 cm for the robot car. This is quite long, but the current suspension design needs it, as the suspension beams cannot really be shortened. In the future I might shorten the design by reworking the suspension, but for now it is good enough.

Battery & Electronics case

The main battery and electronics cases have caused me a lot of problems and took many iterations to get right. Every time you print one, there is something that is not entirely right. The trick has been to measure, measure and measure again every component you want to fit. In the end I drew a sketch on paper roughly showing the placement of the components. Both the battery and electronics case have to fit within a fixed length of 16 cm and a width of 10 cm to fit on the base frame. The electronics case contains special accommodation for the Raspberry Pi, the UBEC power converter, two Grove sensors and the Dynamixel power board:

Note: The electronics casing will have a separate lid that closes up the electronics compartment while still allowing easy access.

The battery case is a lot simpler: we just need something to contain the battery. One of the challenges, however, is that I do not want a lid here; the battery just needs to be easy to replace. For this to work there are two covers on either end of the case that hide the wires but are far enough apart to remove the battery. A note here is that I used rounded edges instead of sharp 90-degree angles to allow for better printing without support. The rounded edges give a pretty decent print on my Ultimaker, and it's a lot better than having support material inside the case. The case looks as follows:

Assembling the robot

Here is a series of pictures of the various parts in different stages of assembly:

Conclusion

The process of getting to the above design and printed parts has not been easy. Each component went through many, many iterations before reaching this state. Even now I still see areas for improvement, but for now I think it is close to being a functional robot car, which was the goal. In future posts I will start talking a bit about the software and the drive system with the mecanum wheels.

For those wanting to have a look at the 3D parts, I have uploaded them to GitHub. The idea is to eventually provide a proper manual on how to print and assemble everything, including a bill of materials, but for now just have a look:
https://github.com/renarj/robo-max/tree/master/3d-parts

To close, here is a picture of the first power-up of the robot car:

Remote Controlling a Nao Robot Using a Raspberry Pi Robot

Today I want to take some time to write about the next step I am taking to have my self-built Raspberry Pi robot and the Nao robot interact with each other in a useful way. You might have already seen earlier posts like https://renzedevries.wordpress.com/2016/06/10/robot-interaction-with-a-nao-robot-and-a-raspberry-pi-robot/ about robot interaction, or perhaps the model train one: https://renzedevries.wordpress.com/2016/09/13/having-fun-with-robots-and-model-trains/. However, neither of these posts really demonstrated a practical use case.

Recently I presented on this topic at the Devoxx conference in Antwerp, where I attempted to demonstrate how to control one robot from another using Kubernetes, Helm and Minikube combined with some IoT glue 🙂 The scenario I demonstrated was to build a robotic arm from my Raspberry Pi robot parts and use it to remote control a Nao robot.

Robot arm
In order to have some form of remote control I have created a robot arm which I can use as a sort of joystick. I built it from the same parts as described in this post: https://renzedevries.wordpress.com/2016/03/31/build-a-raspberry-pi-robot-part-2/. The robot arm is controlled via a Raspberry Pi that runs a bit of Java software to connect it to MQTT: it sends servo position changes and receives commands from MQTT to execute motions on the robot arm.
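
The actual code is part of my own robot SDK, but as a minimal sketch of the idea, sending a servo position change over MQTT with, for example, the Eclipse Paho client looks roughly like this (the broker URL, client id, topic name and payload are placeholders made up for this example):

    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttException;
    import org.eclipse.paho.client.mqttv3.MqttMessage;

    public class ServoPositionPublisher {
        public static void main(String[] args) throws MqttException {
            //Illustrative broker URL and client id
            MqttClient client = new MqttClient("tcp://broker.example.com:1883", "robot-arm");
            client.connect();

            //Publish the current position of the joystick servo (simplified payload,
            //the real SDK wraps this in a richer JSON message)
            MqttMessage update = new MqttMessage("512".getBytes());
            update.setQos(1);
            client.publish("robots/robot-arm/state/servo-1", update);

            client.disconnect();
        }
    }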

The robot looks like this:

Nao Robot
For the Nao robot I have written a custom Java controller that connects to Nao's remote API. This controller software does nothing other than allow remote control of the Nao robot by listening for commands coming in over MQTT.
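
The real controller maps the incoming MQTT commands onto calls to the Nao remote API. As a minimal sketch of just that API side, assuming the official NAOqi Java SDK, it boils down to something like this (the robot URL is a placeholder and exact proxy signatures may differ per SDK version):

    import com.aldebaran.qi.Application;
    import com.aldebaran.qi.Session;
    import com.aldebaran.qi.helper.proxies.ALMotion;
    import com.aldebaran.qi.helper.proxies.ALTextToSpeech;

    public class NaoRemoteControl {
        public static void main(String[] args) throws Exception {
            //The robot URL is a placeholder, use the address of your own Nao
            Application application = new Application(args, "tcp://192.168.1.42:9559");
            application.start();
            Session session = application.session();

            //Proxies to the Nao remote API
            ALTextToSpeech tts = new ALTextToSpeech(session);
            ALMotion motion = new ALMotion(session);

            //A command arriving over MQTT would be mapped onto calls like these
            tts.say("I am being remote controlled");
            motion.moveTo(0.5f, 0.0f, 0.0f); //walk half a meter forward
        }
    }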

Connecting the Robots

As in previous setups, I will be using my custom Robot Cloud deployment for this experiment. I will deploy a number of micro-services to a Kubernetes cluster running on AWS. The most important public service is the MQTT message bus, to which the robots send their status (sensors, servo positions) and from which they receive commands (animations, walk commands, etc.). For more detail on the actual services and their deployment you can check here: https://renzedevries.wordpress.com/2016/06/10/robot-interaction-with-a-nao-robot-and-a-raspberry-pi-robot/

The most important part of bridging the gap between the robots is a dedicated container that receives updates from the servos on the robot arm. Based on events from those servos (for example, moving the joystick forward) I want to trigger the Nao robot to start walking, as sketched below. The full code with a lot more detail is available in this git repository: https://github.com/renarj/robo-bridge
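
As a rough sketch of that bridging logic (the topic names, payload format and joystick threshold are again made-up placeholders; the real implementation is in the robo-bridge repository above):

    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttMessage;

    public class ArmToNaoBridge {
        //Treat joystick servo positions above this value as "pushed forward" (illustrative threshold)
        private static final int FORWARD_THRESHOLD = 600;

        public static void main(String[] args) throws Exception {
            MqttClient client = new MqttClient("tcp://broker.example.com:1883", "arm-to-nao-bridge");
            client.connect();

            //Listen to position updates coming from the joystick servo on the robot arm
            client.subscribe("robots/robot-arm/state/servo-1", (topic, message) -> {
                int position = Integer.parseInt(new String(message.getPayload()));

                //Translate "joystick pushed forward" into a walk command for the Nao robot
                if (position > FORWARD_THRESHOLD) {
                    MqttMessage walk = new MqttMessage("{\"command\":\"walk\",\"direction\":\"forward\"}".getBytes());
                    client.publish("robots/nao/commands/motion", walk);
                }
            });

            //Keep the bridge running
            Thread.currentThread().join();
        }
    }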

Conclusion

It's quite a complex setup, but the conclusion is that, using my Kubernetes-deployed Robot Cloud stack, I can use the robot arm to control the Nao robot. If you want to see a bit more, including a live demo, you can check out my Devoxx presentation here:

One thing I could not demo at Devoxx was the interaction with a real Nao robot. I have made a recording of how that looks and also put it on YouTube here:

Having fun with Robots and Model trains

The last few blog posts have all been about heavy Docker and Kubernetes stuff. I thought it was time for something lighter, so today I want to blog about my hobby robotics project again. Before I had robots to tinker with, I actually used to play around a lot with automating a model train setup. So recently I had the idea: why not combine the two and let one of the robots have some fun playing with a model train 🙂

In this blog post I will use MQTT and the Robot Cloud SDK I have developed to hook up our Nao robot and the model train to the same message bus.

Needed materials

In order to build an automated train layout I needed a model train setup; I already have an existing H0 scale Roco/Fleischmann setup. All the trains on this layout are digitised using DCC-based decoders. If you do not know what this means, you can read about digital train systems here: http://www.dccwiki.com/DCC_Tutorial_(Basic_System)

Hooking up the train

The train system I have is controlled using an Ecos controller, which has a well-defined TCP network protocol I can use to control it. I have written a small library that hooks the controller up to the IoT/robot cloud I have described in previous blog posts. The commands for moving the train are sent to MQTT and then translated into TCP commands the controller can understand.
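
As an illustration of that translation step: the Ecos controller accepts plain-text commands over a TCP socket, so a "set train speed" command coming in over MQTT ends up as something like the snippet below (the controller IP is a placeholder, and the exact command syntax should be checked against the Ecos protocol documentation):

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.net.Socket;

    public class EcosSpeedCommand {
        public static void main(String[] args) throws IOException {
            //Controller IP is a placeholder, 15471 is the default Ecos network port
            try (Socket socket = new Socket("192.168.1.10", 15471);
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {

                //Take control of locomotive 1005 and set its speed, the command
                //syntax here is illustrative
                out.println("request(1005, control, force)");
                out.println("set(1005, speed[127])");
            }
        }
    }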

I will have an MQTT broker available somewhere in the cloud (AWS/Kubernetes) that my robots can also connect to, so this will be the glue connecting the robots and trains.

I don't really want to bother people too much with the technical details of the train and the code behind it, but if you are interested I have put the code on GitHub: https://github.com/renarj/rc-train

Hooking up the robots

Hooking up the robots is actually quite simple; I have done this before and am using the same setup as before. The details are all available in this blog post: https://renzedevries.wordpress.com/2016/06/10/robot-interaction-with-a-nao-robot-and-a-raspberry-pi-robot/

In this case I will be using our Nao robot and hooking it up to the MQTT bridge. The framework I have developed contains a standard message protocol on top of MQTT. This means the messages are always defined the same way, and all parties adhering to the protocol can exchange state and commands with each other. Because both the train and the robot use the same message protocol via MQTT, we can hook them up to each other.

In order to make this a bit more entertaining I want to run a small scenario:
1. Nao walks a bit towards the train and the controller
2. Sits down and says something
3. Starts the train
4. Reacts when the train is running

I always like to put a bit of code in a post, so here is the code that creates this scenario:

    private static void runScenario(Robot robot) {
        //Step1: Walk to the train controller (1.2 meters)
        robot.getMotionEngine().prepareWalk();
        robot.getMotionEngine().walk(WalkDirection.FORWARD, 1.2f);

        //Step2: Let's say something and sit down
        robot.getCapability(SpeechEngine.class).say("Oh is that a model train, let me sit and play with it", "english");
        robot.getMotionEngine().goToPosture("Sit");
        
        //Step3: Let's start the train
        sleepUninterruptibly(1, TimeUnit.SECONDS);
        startTrain(robot.getRemoteDriver(), "1005", "forward");
        
        //Step4: Nao is having lots of fun
        sleepUninterruptibly(5, TimeUnit.SECONDS);
        robot.getCapability(SpeechEngine.class).say("I am having so much fun playing with the train", "english");
    }

    private static void startTrain(RemoteDriver remoteDriver, String trainId, String direction) {
        remoteDriver.publish(BasicCommandBuilder.create("ecos")
                .item("train").label("control")
                .property("trainId", trainId).build());
        remoteDriver.publish(BasicCommandBuilder.create("ecos")
                .item("train").label("light")
                .property("trainId", trainId)
                .property("state", "on").build());
        remoteDriver.publish(BasicCommandBuilder.create("ecos")
                .item("train").label("direction")
                .property("trainId", trainId)
                .property("direction", direction).build());
        remoteDriver.publish(BasicCommandBuilder.create("ecos")
                .item("train").label("speed")
                .property("trainId", trainId)
                .property("speed", "127").build());
    }

What happens here is that the Robot SDK contains a bit of code that translates these Java objects into MQTT messages. Those MQTT messages are then received from the MQTT bridge by the train controller, which translates them again into TCP messages.

For those who are interested, the code I use to create the scenarios around the Nao robot is also available on GitHub: https://github.com/renarj/robo-pep

End result

So what does the end result look like? Well, videos say more than a thousand words (actually ±750 for this post 🙂 )

This is just to show that you can have a bit of fun integrating very different devices. Using protocols like MQTT can really empower robots and other appliances to be integrated tightly and easily. The glue I am adding is a standard message format on top of MQTT for the different appliances, plus the code that hooks them up to MQTT. Stay tuned for more posts about my robotics and hobby projects.