remove the electronics as one unit. I often refer to this as
the “stack.” I replaced the 12V SLA brick with an 11.4V 3s
LiPo battery pack that I could mount in the aluminum
channel from the kit.
The Edison runs on a version of Linux called Yocto,
which I had never heard of before. ROS, however, is built to
operate with Ubuntu and other Debian derivatives. There
was no install path for Yocto Linux, though others had had
some success using UbiLinux, which is, supposedly, a light
version of Ubuntu. So, I installed UbiLinux on the Edison
and then ROS, applying the same build instructions used for
the Raspberry Pi. The installation was successful, and I had a
working ROS system on the Edison.
However, when I went to install the test program, I
could not get the gamepad to work. It turns out the
joystick drivers are not included in UbiLinux. On top of
that, neither was the package needed to install them.
With no community support for that application,
and after spending two weeks looking for answers that
simply didn’t exist, I abandoned the Edison board.
What did have a strong and growing community was
the Raspberry Pi. I happened to have a couple of model B+s
on hand. I knew I could get the test code to run on the Pi
since I had just done it on a project for some film makers.
To simplify the I/O with the sensors, etc., I decided to
also include an Arduino Mega. There were several reasons
for this, not the least of which was the separation of signal
processing from the main program. I would be able to
manage the sensors on the Arduino and just send the
consolidated information as a serial stream to the Pis. With
this, the stack took on a new form with the Roboclaw on
the bottom, followed by two Pis, and the Arduino with a
sensor shield on top. It was also at this point I decided to
add an Xbox Kinect for vision.
At some point, I was further inspired by Actobotics’
Mantis platform. In particular, I was enamored with the
independent suspension which, of course, Nomad had to
have. So, I spoke to the good folks at ServoCity and soon
received the parts for the conversion.
The idea behind the two Raspberry Pis was, again, to
split the processing. Vision processing is intensive, as is
navigation. With two boards, I could dedicate one to
processing the input from the Kinect and the other to
navigation, or as I learned during this process, SLAM
(Simultaneous Localization And Mapping).
ROS installed nicely and I was able to get the test
program running without a hitch. At that point, I started
adding the sensors. Using a ROS package called rosserial,
I was able to read the PING))) sensors through
the Arduino. These were configured to act as a bumper to
avoid any obstacles the Kinect may have missed. The Kinect
itself, however, proved a bigger challenge. I could simply
not get the drivers to work. Others had great success with
this, but at the time I was not one of them.
By now, I was a year into the project. I was trying to
get the Kinect working right up to the next big show for
The Robot Group: South by Southwest (SXSW). For those
of you who aren’t familiar with SXSW, it is a large festival in
Austin, broken into three parts: interactive, film, and music.
SERVO 01.2017 49
By Jeff Cicolani Post comments on this article at www.servomagazine.com/index.php
Nomad with the Intel Edison board and sensor shield.
Roboclaw at the bottom of the “stack.”
Nvidia Jetson TX1 unboxing.