by Jeff Eckert
Stereo Vision System
The SVT™ dual-camera,
dual-processor stereo vision system.
Courtesy of Surveyor Corp.
If your bot or other homebuilt
device needs 3D vision, check out the
Surveyor SVT™: a dual-camera, dual-processor Wi-Fi system geared for
robotics, embedded image processing,
and Web-based remote monitoring.
Surveyor (www.surveyor.com)
points to features including on-board
programmability, Wi-Fi connectivity,
easy sensor and actuator interface,
open source architecture, and a list
price of $550 as key attributes.
You may note from the photo
that the SVT employs Analog Devices’
Blackfin processors, which are
designed for multi-format audio, video, voice,
and image processing (details at
www.analog.com). According to Surveyor
literature, the system takes “full
advantage of the processor’s power
efficiency, optimized video instructions, high speed video interface, and
easy interface to peripheral devices.”
It incorporates two BF537 32-bit
Blackfins, two OmniVision OV9655
1.3 megapixel cameras, PWM motor
control, various interfaces, and
3.3V and 5V regulation for battery
operation. You also get a Lantronix
Matchport 802.11bg radio, all of
which takes up only a 2.5 x 6 x 2 inch
volume and consumes less than 2W.
Programming can be done from Windows,
Mac OS X, or Linux.
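Surveyor doesn't publish its stereo pipeline in this column, but the core computation of any dual-camera rig, estimating per-pixel disparity between the two views, can be sketched in a few lines. This is plain Python/NumPy and purely illustrative; the function name and parameters are my own, not Surveyor's API or firmware:

```python
import numpy as np

def disparity_sad(left, right, block=5, max_disp=16):
    """Brute-force block-matching stereo: for each pixel of the left
    image, find the horizontal shift into the right image that
    minimizes the sum of absolute differences (SAD) over a block.
    Larger disparity = closer object."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = float("inf"), 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(patch.astype(int) - cand.astype(int)).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Real systems (including, presumably, the SVT's firmware) use far faster, more robust variants of this idea, but the disparity map is the raw material of 3D vision either way.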
Bee Safer on the Road

Nissan’s BR23C uses compound “eyes” for collision
avoidance. Courtesy of Nissan Motor Company
Advanced Technology Center.

At the recent CEATEC Japan
show, Nissan (www.nissan-global.com)
unveiled its concept for
automotive safety in the form of the
Biomimetic Car Robot Drive, also
known as BR23C. According to
Nissan, it is based on lessons learned
from the “humble bumblebee,”
specifically the bee’s vision system.
The insect’s compound eyes can
spot obstacles in a 300°+ range, allowing
it to fly safely within its personal space.
In the automotive version, Nissan’s
Laser Range Finder (LRF) system covers
180° and has a sensor range of 2 m
toward the front. The LRF calculates
the distance to another object and
sends the data to an onboard microprocessor, which translates it into a
collision avoidance strategy. But a bee
can bumble up, down, or sideways to
avoid other bees, whereas the BR23C
can only move in two dimensions and
only within the limitations of the wheels.
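Nissan hasn't detailed the onboard strategy, but the loop it describes (measure ranges across 180°, find the nearest return, steer away in the plane of the road) can be sketched like this. Everything here, from the function name to the thresholds, is an illustrative assumption, not the BR23C's actual controller:

```python
import math

def avoid(ranges_m, fov_deg=180.0, threshold_m=2.0):
    """Toy 2D collision avoidance: ranges_m holds range-finder returns
    spread evenly across the field of view (index 0 = far left).
    If the closest return is inside the threshold, steer toward the
    opposite side; otherwise hold course. Returns a steering angle
    in degrees (negative = left, positive = right)."""
    i = min(range(len(ranges_m)), key=lambda k: ranges_m[k])
    if ranges_m[i] >= threshold_m:
        return 0.0
    # Bearing of the obstacle relative to straight ahead.
    bearing = -fov_deg / 2 + i * fov_deg / (len(ranges_m) - 1)
    # Steer a fixed amount away from the obstacle's side.
    return -math.copysign(fov_deg / 4, bearing or 1.0)
```

A wheeled platform can only act on this output in two dimensions, which is exactly the limitation the article notes compared to the bee.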
Presumably, the operating range of
2 m is a prototype limitation, as it would
appear advisable to detect an impending
collision with a bit more time to react.
Let’s say you’re driving toward a telephone pole or other stationary object at
60 mph (88 ft/sec) and detect an obstacle at a distance of 10 ft. That would
allow only about 1/10 sec reaction time
— enough time for an airbag to deploy,
but not so hot for braking or swerving.
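That back-of-the-envelope figure is easy to verify; the 2 m range is Nissan's stated LRF spec, and the rest is unit conversion (plain Python):

```python
MPH_TO_FTPS = 5280 / 3600   # 1 mph in ft/s
M_TO_FT = 3.28084           # 1 meter in feet

def reaction_time(range_ft, speed_mph):
    """Seconds between detection and impact, assuming constant
    speed and a stationary obstacle."""
    return range_ft / (speed_mph * MPH_TO_FTPS)

print(round(60 * MPH_TO_FTPS))                    # 88 ft/s
print(round(reaction_time(10, 60), 2))            # 0.11 s at a 10 ft detection range
print(round(reaction_time(2 * M_TO_FT, 60), 3))   # 0.075 s at the LRF's 2 m range
```

At highway speed, the prototype's 2 m sensor range leaves even less margin than the article's 10 ft example.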
Nissan says the device “only needs
to process inputs every few seconds,
and act on that.” Hmmm. In any event,
the company hopes that the system will
cut the number of accident fatalities
and serious injuries to Nissan drivers by
half by 2015, as compared with 1995.
And Bee Careful of Flying

The KillerBee — a fully autonomous
UAS — will be used by the US Navy
and Marine Corps for both
ground-based and ship-launched
surveillance and reconnaissance.
Courtesy of Raytheon Corp.
Another ostensible offspring of
the genus Bombus is the KillerBee
unmanned aircraft system, which is
being provided to the US Navy and
Marine Corps by a team made up
of Swift Engineering and Raytheon
(details at www.killerbeeuas.com).
It looks more like a manta ray leaping
through the surface of the gulf, but
it’s considerably more lethal than
either one when connected to combat and command control systems.
According to a recent announcement,
“KillerBee has the ability to insert
persistent intelligence, surveillance,
and reconnaissance (ISR) into the battle space and rapidly deliver
actionable intelligence to …”
Late in September, a
Raytheon flight operations
crew simulated a combat
environment by delivering
the KillerBee system to
a remote location via
Humvees. In less than 45
minutes, the crew set up …
8 SERVO 12.2008