Photo: Front angle of hexapod, in the wild and staring right at you!
Matt used a reinforcement learning algorithm called Q Learning to help the robot teach itself to
walk straight forward. The Q values live in a 2D matrix
whose rows represent the robot's current state
and whose columns represent the possible states it
could transition to. Starting from the row for its
current state, the robot randomly chooses a next state,
then records sensory information that tells it how well
that move carried it straight forward.
If it did well, the transition receives a high Q value as a
reward for learning well. The chosen column then becomes the
new current state, and the process begins again.
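That loop can be sketched in a few lines of Python. The state count, reward function, and learning constants below are illustrative stand-ins, not details from Matt's implementation; in the real robot the reward comes from the vision system described next.

```python
import random

random.seed(0)              # reproducible run for this sketch
N_STATES = 4                # hypothetical number of leg configurations
ALPHA, GAMMA = 0.5, 0.9     # learning rate and discount factor (made up)

# 2D Q matrix: rows = current state, columns = candidate next state.
Q = [[0.0] * N_STATES for _ in range(N_STATES)]

def reward(state, next_state):
    # Stand-in for the optic-flow score: here we simply reward
    # stepping to the next state in sequence ("moving forward").
    return 1.0 if next_state == (state + 1) % N_STATES else 0.0

state = 0
for _ in range(1000):
    next_state = random.randrange(N_STATES)   # explore a random transition
    r = reward(state, next_state)             # sensory feedback for the move
    # Standard Q-learning update toward the best value reachable next.
    Q[state][next_state] += ALPHA * (
        r + GAMMA * max(Q[next_state]) - Q[state][next_state]
    )
    state = next_state                        # chosen column becomes new row
```

After enough trials, each row's highest entry marks the transition that best moves the robot forward from that state.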
As for the sensory information, it is mostly
visual. The vision computation for the Q Learning
requires the robot to calculate the optic flow
between two shots of the same scene taken
before and after it moves. If the robot moves straight
forward, the vectors in the resulting flow field should all
point outward in the second image. The robot earns
higher Q values for moving straight forward, and lower
ones for any other motion. The robot
uses algorithms both to determine which movements are high and low
value and to decide whether to replay a recorded
high-value movement or to try to match or better it.
Records of high-value
movements improve the robot's forward
motion over time as the number of records grows.
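The "outwardness" of a flow field can be scored by projecting each flow vector onto the radial direction away from the image center: expansion scores positive, contraction negative. The sketch below is self-contained and hand-builds a flow field; in the real system the flow would come from OpenCV (e.g., dense optic flow between the two successive frames). All names and sizes here are illustrative.

```python
import math

def outward_score(flow, width, height):
    """Mean projection of each flow vector onto the unit radial direction
    away from the image center; positive means expansion (forward motion)."""
    cx, cy = width / 2.0, height / 2.0
    total, count = 0.0, 0
    for (x, y), (fx, fy) in flow.items():
        rx, ry = x - cx, y - cy
        norm = math.hypot(rx, ry)
        if norm == 0:
            continue                              # skip the exact center
        total += (fx * rx + fy * ry) / norm       # radial component
        count += 1
    return total / count if count else 0.0

# Synthetic expanding field: what straight-forward motion should produce.
W = H = 8
expanding = {(x, y): ((x - W / 2) * 0.1, (y - H / 2) * 0.1)
             for x in range(W) for y in range(H)}
score = outward_score(expanding, W, H)   # positive, so a high Q value
```

A field of vectors pointing toward the center (the robot backing up) would score negative, earning a low Q value.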
The vision programming uses OpenCV, which stands for
Open Source Computer Vision. OpenCV is a library of vision
computation functions and related functions that make
vision computations easier to implement.
For the Intel robot version, Matt used an OpenCV
function to detect faces in the image the robot's camera has
taken. This outputs the location of the center of each face in
the image. “I wrote a small program to move the
camera in such a way as to center the largest face in
the image,” says Matt. This makes it appear that the robot
is interacting with people.
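The centering step can be sketched from the detections alone. OpenCV's face detectors return bounding boxes as (x, y, w, h) tuples; given those, picking the largest face and measuring its offset from the image center tells the camera which way to move. The function name and the servo-command mapping below are made-up stand-ins, not Matt's actual program.

```python
def centering_error(faces, frame_w, frame_h):
    """Return (dx, dy) from the image center to the largest face's center,
    or None if no faces were detected. faces = [(x, y, w, h), ...]."""
    if not faces:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest by area
    face_cx, face_cy = x + w / 2.0, y + h / 2.0
    return face_cx - frame_w / 2.0, face_cy - frame_h / 2.0

# Example: two detections in a 640x480 frame. The larger face sits right
# of and below center, so the camera should pan right and tilt down.
faces = [(100, 100, 40, 40), (400, 200, 120, 120)]
dx, dy = centering_error(faces, 640, 480)
```

Feeding (dx, dy) into the pan and tilt servos each frame drives the error toward zero, keeping the largest face centered.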
Shake Your Body, Hexapod
Because Matt used a more traditional approach known
as inverse kinematics (IK) to operate the hexapod robot,
its stepping time as it moves through its
various gaits is very rigid, repetitive, and rhythmical, so it is
often interpreted as coordinating well with music, i.e.,
dancing. “This is different from any other program that
uses a learning algorithm,” comments Bunting.
IK enables the robot to position each of its feet at a
specific X, Y, Z location, using trigonometry to solve for the
angles the motors should move to in order to reach those specific
X, Y, Z coordinates. “You have to measure all the
distances of all the linkages of the legs, and determine how
to set the angles of the servo motors in order to accomplish
this,” Matt explains.
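For a single leg viewed from the side, that trigonometry reduces to the classic two-link problem: given the femur and tibia lengths, solve for the hip and knee angles that place the foot at (x, y). The sketch below uses the law of cosines; the link lengths are made-up, and a real hexapod leg adds a hip-rotation joint for the third axis.

```python
import math

L1, L2 = 5.0, 7.0   # femur and tibia link lengths (illustrative units)

def leg_ik(x, y):
    """Return (hip, knee) joint angles in radians placing the foot at (x, y)."""
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > L1 + L2 or d < abs(L1 - L2):
        raise ValueError("target out of reach")
    # Law of cosines gives the knee angle between the two links...
    knee = math.acos((d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2))
    # ...then the hip angle is the direction to the foot, corrected for
    # how far the knee bend swings the second link.
    hip = math.atan2(y, x) - math.atan2(L2 * math.sin(knee),
                                        L1 + L2 * math.cos(knee))
    return hip, knee
```

Running the angles back through forward kinematics (L1 along the hip angle, L2 along hip plus knee) recovers the requested foot position, which is a handy self-check when calibrating the measured linkage lengths.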
The fit-PC2, which uses the Intel
Atom processor, gave the second hexapod
design both power and computational
efficiency. “The Atom processor can actually run
without a heatsink while decompressing and
displaying high definition video,” says Matt Bunting,
now a University of Arizona PhD student in electrical
and computer engineering. The paired Poulsbo
chipset found on the fit-PC2 would usually require
cooling, but not in this application.
“It is actually impressive to be able to run a full
blown Ubuntu installation on a tiny 1.6 GHz machine
with kinematic and vision processing code, while
only using five watts of power,” Bunting affirms. “It is
quite an impressive little machine.”
SERVO 09.2011 11