What Is LabVIEW?
LabVIEW is a graphical programming
environment used by millions of engineers and
scientists to develop sophisticated measurement,
test, and control systems using intuitive
graphical icons and wires that resemble a
flowchart. LabVIEW offers integration with
thousands of hardware devices and provides
hundreds of built-in libraries for advanced
analysis and data visualization. The LabVIEW
platform is scalable across multiple targets and
operating systems.

The controller signals the motors so they turn the robot, move it
forward or backward, or even keep it stationary.
NIRo is a good showcase of how the NI platform can
provide the foundation for large, complex robots that
perform obstacle avoidance and exploration. “The use of LabVIEW
Real-Time, LabVIEW FPGA, and Single-Board RIO enables
developers to easily integrate hardware and software, and
to rapidly design, develop, and deploy algorithms.”
The Element is another demo built for teleoperated
search and rescue. Its tank-style treads and the ability to build
everything inside the robot stem from a platform from
MESA Robotics. Two motors, controlled by an NI CompactRIO
controller, mobilize the robot. A webcam gives the
operator a view along Element’s line of sight. A Windows
PC communicates with the robot via 802.11 wireless.
operator enters commands via a 360 wireless
controller. From as far as 200 feet away, the
operator maneuvers the robot
within its intended environment. Thanks to the treads, this
includes difficult terrain (indoors and outdoors) and stairs.
The controller sends its signals via Bluetooth to a
laptop running LabVIEW, which relays them to the
controller on the robot. The robot’s controller processes the
commands in real time and forwards them to the smart
motors for action.
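The command path described above (gamepad, then laptop, then robot controller, then motors) can be sketched roughly as follows. This is a hypothetical illustration, not the demo's actual code: the function names, the tank-drive mixing, the JSON packet format, and the address are all assumptions.

```python
# Hypothetical sketch of the teleoperation relay: the laptop reads the
# gamepad's throttle/steer axes, mixes them into left/right tread speeds
# (tank drive), and forwards a command packet over the wireless link.
import json
import socket

def tank_drive(throttle, steer):
    """Map joystick throttle/steer (each -1..1) to left/right tread speeds."""
    left = max(-1.0, min(1.0, throttle + steer))
    right = max(-1.0, min(1.0, throttle - steer))
    return left, right

def send_command(sock, addr, throttle, steer):
    """Encode the tread speeds and forward them to the robot's controller."""
    left, right = tank_drive(throttle, steer)
    packet = json.dumps({"left": left, "right": right}).encode()
    sock.sendto(packet, addr)  # 802.11 link to the robot (address assumed)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_command(sock, ("192.168.1.50", 5005), throttle=0.5, steer=0.5)
```

The mixing step is the interesting part: equal throttle and steer pins one tread at full speed and stops the other, which is how tank-style platforms pivot.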
Element also uses a sensor called an inclinometer to read its
tilt and keep track of the terrain. The data is
useful in constructing a 3D model of the robot.
NIcholas is a much smaller scale UGV that uses
autonomous navigation techniques to explore its
surroundings. The robot builds a picture of
its environment using a Hokuyo laser rangefinder (LIDAR).
A Vector Field Histogram (VFH) algorithm helps it to avoid
obstacles. NIcholas uses an inertial measurement unit to
sense velocity and acceleration.
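The IMU's role can be illustrated with a simple dead-reckoning sketch: integrating acceleration into velocity, and velocity into position, once per sample. The sample rate, units, and one-dimensional simplification here are illustrative assumptions, not NIcholas's actual processing.

```python
# Hypothetical sketch of dead reckoning from IMU readings: each sample,
# accumulate acceleration into velocity, then velocity into position
# (semi-implicit Euler integration).

def integrate(accel_samples, dt):
    """Integrate 1-D acceleration samples (m/s^2) taken every dt seconds."""
    velocity, position = 0.0, 0.0
    for a in accel_samples:
        velocity += a * dt          # v = v0 + a*dt
        position += velocity * dt   # x = x0 + v*dt
    return velocity, position

# Constant 1 m/s^2 for one second, sampled at 10 Hz:
v, x = integrate([1.0] * 10, dt=0.1)
```

In practice IMU integration drifts quickly, which is one reason the robot pairs it with LIDAR rather than relying on the IMU alone.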
The LIDAR sensor connects to a serial port on the
processor, while the IMU connects to digital lines on the
Single-Board RIO, which also drives the motors.
Obstacle avoidance and drive topologies are much the
same as with the NIRo robot. The LIDAR uses a laser to
measure the distances and angles to nearby objects. This
information is sent to the VFH algorithm to compute a path
that best avoids the nearby obstacles. Once the path is
computed, the FPGA is used to send PWM signals to the
drive and steering motors.
The NI robots make great experimental platforms for
developing and testing algorithms. Robot recipe code is
available as a free download to the community. SV
SERVO 08.2010 13