40 SERVO 08.2017
In robotics, where concurrent processes need to take place (servo control, navigation, etc.), an FPGA can have advantages over a microprocessor approach. This is because the FPGA fabric lets the designer create multiple logic modules that operate independently, as shown in Figure A. Each of the individual logic modules shown in the figure can run at full clock speed, concurrently, and without affecting the speed of neighboring logic or the higher level state machine.
A microprocessor handles each task sequentially, and can be interrupted by higher priority tasks, making latency non-deterministic. In some cases, this can be managed when the peripheral set is fixed, or by using multiple processor cores. Generally, however, the processor operates sequentially as in Figure B, where one process can stall execution of the next function.
The goal in developing a design to interface to the Silicon Labs PMOD sensor was two-fold. In order to start building an FPGA-based robot like Figure A, it’s necessary to develop the interfaces for sensors or peripherals controlled by the FPGA. One of the most common sensor interfaces is I2C, so part of this project’s goal was to create a robust I2C interface module for later use. The Verilog code is written to be cleanly portable to any FPGA architecture with very little effort.
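The article’s Verilog module isn’t listed in this excerpt, but the bus transaction it must generate is fixed by the I2C protocol: a combined register read is a write phase (device address with R/W = 0, then the register pointer) followed by a repeated start and a read phase (device address with R/W = 1). The Python sketch below just builds those byte sequences for reference; the device address 0x60 and register 0x22 are illustrative values, not taken from the article.

```python
def i2c_read_sequence(dev_addr, reg_addr):
    """Byte sequence for a combined I2C register read.

    Phase 1 (write): START, device address with R/W=0, register pointer.
    Phase 2 (read):  repeated START, device address with R/W=1, then the
                     master clocks in data bytes and ends with NACK + STOP.
    """
    write_phase = [(dev_addr << 1) | 0, reg_addr]  # address + W, register
    read_phase = [(dev_addr << 1) | 1]             # address + R
    return write_phase, read_phase

# Illustrative: 0x60 is a common Si114x 7-bit address; 0x22 stands in
# for a data register.
w, r = i2c_read_sequence(0x60, 0x22)
```

An I2C master state machine in FPGA logic walks exactly these phases, one SCL period per bit, with an ACK check after each byte.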
The second goal was to be able to use the capabilities of the PMOD sensor itself. The temperature/humidity/ambient UV and light capabilities are self-explanatory for a mobile robot that wants to sense the environment. These values are read from a register after a measurement command, and some calculations yield a result in degrees Celsius, percent relative humidity, or lux.
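As an example of the “some calculations” step, here are the temperature and humidity conversions in Python, assuming the PMOD’s humidity/temperature device follows the Si7021-style datasheet formulas (an assumption; check the datasheet for the exact part on your PMOD).

```python
def temp_celsius(raw):
    """Si7021-style conversion: 16-bit temperature code -> degrees Celsius."""
    return 175.72 * raw / 65536.0 - 46.85

def rel_humidity(raw):
    """Si7021-style conversion: 16-bit RH code -> percent relative humidity."""
    rh = 125.0 * raw / 65536.0 - 6.0
    # The raw formula can slightly overshoot the 0-100% range, so clamp it.
    return min(max(rh, 0.0), 100.0)
```

In FPGA logic, the same arithmetic is typically done in fixed point (a multiply, a shift, and a subtract) rather than floating point.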
The proximity sensing using the PMOD is more complex. The sensor does not report “there is a reflecting object 20 centimeters from the sensor.” Instead, it reports a set of raw light readings that can change as a result of the target reflectance, ambient infrared conditions, and optical leakage. This is shown in Figure C.¹ In my case, however, there is no overlay, so presumably there is no optical leakage.
Not knowing what the raw values would “look like” for various targets, I used the developed interface design to make measurements on a representative target: a new soda can at distances between 5 and 40 centimeters from the sensor. While I could see reproducible differences in the results, I couldn’t derive a simple “threshold” value to declare the presence of an object. This was because the target reflectance and the ambient infrared add together and can change the raw value radically.
The Silicon Labs application note (AN498) gives a method of “dynamic baselining” to establish the optical environment and correct for slow changes that are not a result of a target in the visual field.
Thinking about this, either baselining or some
simple filtering in FPGA logic could be used to do
the last step of interpreting the sensor data and
reaching the goal of sensing objects near the robot.
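One plausible form of that interpretation step is sketched below: a slowly adapting baseline (here a simple exponential moving average, which is my choice, not necessarily AN498’s exact algorithm) tracks the optical environment, and a target is declared only when a reading rises well above it. The `alpha` and `threshold` values are placeholders to be tuned against real measurements.

```python
class ProximityBaseliner:
    """Minimal dynamic-baselining sketch (not AN498's exact algorithm).

    The baseline tracks slow changes in the optical environment; a target
    is declared only when a reading rises well above that baseline.
    """

    def __init__(self, alpha=0.05, threshold=200):
        self.alpha = alpha          # small alpha -> baseline adapts slowly
        self.threshold = threshold  # counts above baseline to call "target"
        self.baseline = None

    def update(self, raw):
        if self.baseline is None:
            self.baseline = float(raw)      # seed from the first reading
        excess = raw - self.baseline
        target = excess > self.threshold
        if not target:
            # Only track the environment when no target is present, so a
            # lingering object doesn't get absorbed into the baseline.
            self.baseline += self.alpha * (raw - self.baseline)
        return target, excess
```

The same structure maps naturally onto FPGA logic: the moving average is a multiply-accumulate (or a shift-and-add for power-of-two alphas), and the threshold test is a single comparator.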
The filtering logic is perfectly suited to the
FPGA, but the scope of that logic was beyond what
I wanted to show with the initial interface design.
As a second project, I plan to add the logic that can filter the raw data, looking for targets in the visual field.
Figure C. Proximity sensor — baselining concept.
Figure A. FPGA robotics implementation.
Figure B. Microprocessor robotics implementation.
1. Silicon Labs, AN498 Si114x Designer’s Guide, p. 27