MobBob is a smartphone-controlled, 3D-printed companion robot. By harnessing the power of a smartphone, MobBob is a walking, talking robot with voice recognition and computer vision that you can build for around $30.
He can currently: walk, talk, understand voice commands, play peekaboo, and follow a coloured ball.
I will be continuing to extend his features over time. My goal is to make MobBob a companion robot that everyone can afford and have fun with.
He is a 4-servo biped, and uses a Bluno Beetle (or Nano) as his microcontroller. The Bluno has Bluetooth LE built in, so I use that for communication with the phone. On the phone, I have an app written using the Unity 3D game engine.
See him in action here!
Using Computer Vision to follow a ball:
Playing Peek-a-boo with me:
Showing off his moves:
MobBob's servos are controlled by DFRobot's Bluno Beetle, an Arduino-compatible board with built-in Bluetooth LE.
The Android phone communicates with the Bluno Beetle over Bluetooth LE, so no explicit pairing is needed.
The Arduino Software (Firmware)
The Arduino code accepts both high-level commands (like "walk 3 steps forward", "turn left", "do the wobble animation 3 times"), and low-level commands to directly set the positions of the servos.
The Android App (Main software)
The app code was written in C# using the Unity 3D engine. Using a game engine for robotics actually makes a lot of sense! For a start, games are real-time, always-running applications that respond to frequent input... very similar to what's needed for a robot app. It also makes it easy to create dynamic visuals like MobBob's face, and easy to create nice user interfaces for interacting with MobBob.
And while I'm not using this in MobBob's app at the moment, game engines also provide support for representing 3D spaces (and obstacles in them), and include algorithms for things like path finding. It would even be possible to use a game engine's animation system to drive servo positions!
The voice recognition uses Google's speech-to-text functionality which is accessible on Android.
The computer vision is using OpenCV.
I wanted to make MobBob act alive, so his face is always animating, and stays on the screen even when the menu panel is displayed.
I will be continuing to add functionality to MobBob's app over time! He's going to do a lot more!
And, if other hackers want to build their own MobBobs and create their own apps... I would love to play with what other makers create. :D
The 3D Printed Parts
When I built the original MobBob, I was still figuring out how to make him work, so there was some trial and error (and hot glue)!
However, I have refined the design with MobBob V2. V2 is easier to print and assemble, and it's possible to tweak the position of the parts during assembly to ensure that he is evenly balanced. (You need to make sure the mobile battery booster and the phone balance each other out to keep him stable.)
V2's new mounting holes can also be used to add MobBob accessories!! I have some fun ideas for these and will be putting them on Thingiverse soon. Stay tuned! :D
Resources for MobBob
I'm now also writing about MobBob on my own blog here: http://www.cevinius.com/mobbob/
The Bluno (Arduino) code for MobBob can be found on GitHub:
The latest version of the Android app can be downloaded from Google Play:
(The app is free to download, has no ads, and no IAP.)
I've also written some detailed assembly and wiring instructions here:
Update 27th September 2015 - MobBob now has Arms!!
I've created some optional arms for MobBob. The files can be found on Thingiverse here:
I will make some fully 3D-printable arms down the track, but for now, I've made some adapters for attaching Lego Technics arms.
I've made two sample arms so far. One is a static, poseable claw. The other has a Sharp IR distance sensor on it.
The IR distance sensor arm is poseable, so the sensor can be aimed up or down. This should let it be used both for detecting obstacles (such as walls) and for detecting edges that he doesn't want to fall off.
I haven't updated the software yet. I'll be doing that this week. I'll add support for the sensor in the Arduino code, and add some functionality in the app to make use of this new data.
I've also written more about it on my blog: