Since the programming cable sourced power
to the module, we were able to test the program
in situ after retrieving the SD card from the
computer. The toothy smile appeared as expected.
When we touched the screen, it switched to the
tongue picture, as if the screen were trying to get a
taste of the trespassing finger.
As an aside, we were impressed by the
capabilities of the touch screen. The internal
commands allow the user to set specific regions of
the touch screen as responsive to different inputs.
Given the size of the widescreen LCD module, it
could make a nice touch pad controller for a bot,
and premade button objects in the Visi interface
would make putting it together a breeze.
VisiGenie would even allow the design of such
a controller without having to type in code at all.
After importing a button or other object into
VisiGenie, users simply assign events to each
button's input in an easy-to-use table.
With the touch screen programming sorted
out, the power of analogical reasoning made
programming the OLED module a cakewalk. To change
from one expression to the other, we used a simple if-else
construct and the GPIO internal functions. We programmed
the screen to switch from a still picture of a forward-gazing
Eye to the animated GIF of the burning Eye when a certain
digital input pin read high. Of course, we needed a sensor
to create that input, so we couldn’t test this program as
easily; first, we had to get the robot base ready.
The VEX robot base was equipped with two limit
switches at its front corners and a bumper sensor in the
middle. Each sensor could be wired to a different digital
input on the smart module, allowing a different
expression depending on where the robot struck an
obstacle. The challenge here was to truly
integrate the smart modules into the robot.
Instead of being passive displays, we wanted the bot’s
face to react to the exact stimuli that caused the robot to avoid
obstacles autonomously. At a high level, the solution is
simple – wire the signal lead from the sensors to both the
VEX brain and the smart modules. The most obvious way to
do this had its own entanglements, though: splice three leads
onto the signal wire of each sensor, with one going to the
VEX brain and the others to each smart module. This could
quickly escalate into an unmanageable web of wires. There
had to be a better way.
Of course, there was. Instead of splicing together wires
with wild abandon, a better solution was to leave the wires
on the sensor alone and have it plug into the VEX brain
normally. The smart modules could then wire into digital
output ports on the brain itself. This might not cut down
SERVO 05.2013 73