Integrating a VR Controller with a 3-D Printed Robot Arm

This is a milestone post in my series describing my experience building a 3-D printed InMoov robot using a Raspberry Pi 3 and ROS.  The source code is now available here.  In my last post, I described the development of a user interface for adjusting the servo positions with a touch screen.  Since then, I have repackaged the hardware from a breadboard implementation to a “permanent breadboard” implementation, making it more compact and neat.  Here is a video that highlights that progress.

Taking a Leap

The next step in this project is to integrate a Leap Motion virtual reality controller.  The Leap Motion uses infrared cameras to track the position of your hands and fingers in the space in front of your monitor.  It is intended to let you control what is happening on your computer, including virtual reality applications.  In this case, we are going to use it to have the robot hand mimic my own.

[Figure: Conceptual architecture of the Leap Motion integration with a Raspberry Pi-based robot arm]
  1. The Leap Motion is plugged into my MacBook Pro via a USB cable.  I installed the V2 Desktop SDK so that I could get hand and finger position data through an API.  The Leap Motion monitors your hands and generates up to 100 frames per second of JSON data, which is available on a WebSocket API hosted by a daemon process.  The documentation is above average but still has a few inconsistencies.
  2. As I have mentioned previously, I am running an Ubuntu virtual machine on my MacBook Pro and have installed ROS on that virtual machine.  I originally intended to install ROS natively on the Mac, but I was unable to get that working.  As a result, a ROS Publisher runs on the Ubuntu VM and is responsible for bridging the Leap Motion data into the ROS messaging infrastructure.  This component converts the JSON-based Leap Motion data into ROS messages that preserve nearly all of the original frame data.  It cannot run on the Raspberry Pi because the Pi cannot keep up with the bandwidth generated by the Leap Motion.  During the conversion, the publishing frequency is governed down to 20 frames per second; a sketch of this bridge appears after this list.
  3. The Raspberry Pi subscribes to the ROS-compatible Leap Motion messages and converts the Leap Motion position vectors into angle goals for the hand servos.  This is done with a ROS Subscriber (to receive the Leap messages and convert them into a goal) and a ROS Action Server (to update the hand and finger positions); a sketch of this conversion also appears below.
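
To make steps 1 and 2 concrete, here is a minimal sketch of the Ubuntu-side bridge.  This is not the published node: the topic name leap_frame, the use of a plain String message, and the exact WebSocket endpoint are my assumptions for illustration.

```python
#!/usr/bin/env python
# Minimal sketch: read Leap Motion JSON frames from the leapd WebSocket
# and republish them into ROS, governed down to ~20 frames per second.
# Topic name and String message type are illustrative assumptions.
import json
import rospy
import websocket  # pip install websocket-client
from std_msgs.msg import String

LEAP_WS_URL = "ws://127.0.0.1:6437/v6.json"  # assumed leapd WebSocket endpoint
PUBLISH_RATE = 20.0  # governed down from the Leap's ~100 fps


def main():
    rospy.init_node("leap_bridge")
    pub = rospy.Publisher("leap_frame", String, queue_size=1)
    ws = websocket.create_connection(LEAP_WS_URL)
    min_interval = rospy.Duration(1.0 / PUBLISH_RATE)
    last_published = rospy.Time(0)
    while not rospy.is_shutdown():
        raw = ws.recv()  # one JSON message from the Leap daemon
        now = rospy.get_rostime()
        if now - last_published < min_interval:
            continue  # drop frames to hold the publish rate at 20 fps
        data = json.loads(raw)
        if data.get("hands"):  # skip handshake/empty frames
            pub.publish(String(data=json.dumps(data)))
            last_published = now


if __name__ == "__main__":
    main()
```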
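And here is a minimal sketch of the Pi-side conversion from step 3.  The distance-to-angle mapping, the 30-170 degree servo range, and the distance bounds are again illustrative assumptions, not the published code's values; the real node would forward the angle to the hand's Action Server rather than just logging it.

```python
#!/usr/bin/env python
# Minimal sketch: subscribe to the bridged Leap frames and map each
# fingertip's distance from the palm onto a servo angle goal.
import json
import rospy
from std_msgs.msg import String

SERVO_MIN_DEG, SERVO_MAX_DEG = 30.0, 170.0   # assumed servo travel
DIST_OPEN_MM, DIST_CLOSED_MM = 90.0, 30.0    # assumed fingertip-to-palm bounds


def distance_to_angle(dist_mm):
    """Map fingertip-to-palm distance onto the servo's angular range."""
    span = DIST_OPEN_MM - DIST_CLOSED_MM
    ratio = max(0.0, min(1.0, (dist_mm - DIST_CLOSED_MM) / span))
    return SERVO_MAX_DEG - ratio * (SERVO_MAX_DEG - SERVO_MIN_DEG)


def on_frame(msg):
    data = json.loads(msg.data)
    if not data.get("hands"):
        return
    palm = data["hands"][0]["palmPosition"]  # [x, y, z] in millimetres
    for finger in data.get("pointables", []):
        tip = finger["tipPosition"]
        dist = sum((t - p) ** 2 for t, p in zip(tip, palm)) ** 0.5
        angle = distance_to_angle(dist)
        rospy.loginfo("finger %s -> %.1f deg", finger["id"], angle)
        # The real node sends this angle as a goal to the Action Server here.


def main():
    rospy.init_node("leap_to_servo")
    rospy.Subscriber("leap_frame", String, on_frame)
    rospy.spin()


if __name__ == "__main__":
    main()
```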

The diagram below depicts these activities:

[Figure: Leap Motion activity diagram]

The published code depends very heavily on the physical implementation of the Raspberry Pi, the servo controller and the servo channel assignments.  This diagram shows how I have assembled most of the hardware:

[Figure: InMoov servo controller breadboard wiring diagram]
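
To make that dependence concrete, a channel-assignment map might look like the snippet below.  These names and channel numbers are purely hypothetical, assuming a 16-channel PWM servo controller such as the PCA9685; the actual assignments depend entirely on how the hand is wired.

```python
# Hypothetical servo channel assignments; not the published mapping.
SERVO_CHANNELS = {
    "thumb": 0,
    "index": 1,
    "middle": 2,
    "ring": 3,
    "pinky": 4,
    "wrist": 5,  # wrist rotation (currently disabled; see below)
}
```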

Here is a demonstration of how well the integration works.  There is a bit of a delay between my hand moving and the robot reacting, and it appears that two fingers need some adjustment.

I am now at a point where I have to decide on my next steps in this project.  There appears to be a minor bug in the wrist rotation, so that has been commented out for now.
