calling object. It then grabs a single
frame from the camera and stores it
in the frame buffer. I’ve allocated
5,280 longs (~20.6 KB) in hub RAM
for the frame buffer, leaving just
under 4 KB for the stack and other variables.
Grabbing a frame consists of
waiting for VSYNC to go high which
signals the start of a new frame,
then waiting for HREF to go high
which signals the start of a new
line. Then, pixel data (alternating
Y and U/V bytes) is captured from
DVP[7:0] every time PCLK goes high
and is stored in the frame buffer. After the complete frame is stored in the
buffer, the cog sets a flag in hub RAM to a non-zero state so the calling object
knows that the frame grab is done. The cog then stops itself.
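The grab sequence can be sketched in C, with the caveat that the real routine is hand-timed Propeller assembly running in its own cog. The pin-read helpers here (read_vsync(), read_href(), read_pclk(), read_dvp()) are hypothetical stand-ins, simulated so the control flow can run on a PC, and the frame dimensions are illustrative rather than the camera's actual resolution:

```c
#include <stdint.h>

enum { LINES = 2, BYTES_PER_LINE = 4 };          /* tiny demo frame */

/* Simulated pin reads so the loop can run on a PC; on the Propeller
   these would be tests of the INA register bits wired to the camera. */
static unsigned tick;
static int read_vsync(void) { return 1; }             /* frame start  */
static int read_href(void)  { return 1; }             /* line active  */
static int read_pclk(void)  { return (tick++) & 1; }  /* toggling clk */
static uint8_t read_dvp(void) { static uint8_t b; return b++; }

static volatile int frame_done;      /* flag polled by the caller     */

void grab_frame(uint8_t *buf)
{
    while (!read_vsync()) ;                  /* wait: new frame       */
    for (int line = 0; line < LINES; line++) {
        while (!read_href()) ;               /* wait: new line        */
        for (int px = 0; px < BYTES_PER_LINE; px++) {
            while (!read_pclk()) ;           /* data valid: PCLK high */
            *buf++ = read_dvp();             /* latch Y or U/V byte   */
            while (read_pclk()) ;            /* wait for PCLK to fall */
        }
    }
    frame_done = 1;          /* non-zero: tell the caller we're done  */
}
```

On real hardware, the store and pointer increment would happen during the PCLK-low half-phase, which is exactly the timing constraint discussed next.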
I needed to increase the Propeller’s system clock from 80 MHz to 96 MHz
(using a 6.0 MHz crystal) in order to meet the camera’s timing requirements.
Even at a relatively slow PCLK of 2 MHz, I only have 0.5 µs to properly read
and store each pixel. At 96 MHz — where each cycle takes 0.01042 µs — that
equates to 48 cycles. Since most instructions on the Propeller take four cycles, I
don’t have much time to work with!
To make the timing even more tricky, the camera’s image data (passed in
on DVP[7:0]) is only valid when PCLK is high, so I have to read the data within
24 cycles (the first half of PCLK). Then, I have the next 24 cycles — while PCLK is
low — to store the data into the frame buffer and increment counters/pointers.
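The cycle budget is simple to verify: the number of system-clock cycles per pixel is just the ratio of the two clocks. A quick sanity check (the helper function is mine, not from the article):

```c
/* System-clock cycles available per camera pixel: sysclk / PCLK. */
double cycles_per_pixel(double sysclk_hz, double pclk_hz)
{
    return sysclk_hz / pclk_hz;
}
/* cycles_per_pixel(96e6, 2e6) == 48: 24 cycles while PCLK is high to
   read DVP[7:0], and 24 while it is low to store the byte and bump
   counters -- only about six 4-cycle instructions per half-phase. */
```

That six-instructions-per-phase figure is why the grab loop has to be so tightly hand-timed.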
Since I had yet to write any tools to aid in acquiring images from the
camera, I went through a very manual process to create a bitmap image for
viewing. The LRF dumped the frame in printable ASCII through its serial port to
the Parallax serial terminal. Then, I copied the bytes into a hex editor, saved it as
a .RAW file, imported the .RAW into Photoshop, and saved it as a .BMP. It was
time-consuming, but worked just fine for development purposes. Check out a
video of the process in action at www.youtube.com/watch?v=URqUYhg4IvI.
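The copy/paste step could be automated with a tiny converter that turns the terminal's printable-ASCII dump into a headerless .RAW file directly. This sketch assumes the dump is whitespace-separated two-character hex bytes; the article doesn't show the exact dump format, so treat that as an assumption:

```c
#include <stdio.h>

/* Parse whitespace-separated hex byte pairs (e.g. "A5 3f 00") from
   `in`, write the raw bytes to `out`; returns bytes written. */
size_t hexdump_to_raw(FILE *in, FILE *out)
{
    unsigned byte;
    size_t n = 0;
    while (fscanf(in, "%2x", &byte) == 1) {  /* %2x caps field at 2 */
        fputc((int)byte, out);
        n++;
    }
    return n;
}
```

Wrapped in a small main() that opens the dump file and an output file, this replaces the hex-editor step; Photoshop can then import the .RAW as before.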
It took me a few days to tweak the frame grabber cog’s timing (I was
accidentally missing pixels, which caused a corrupted image) and to fix some
incorrect automatic exposure settings (which were giving me extremely dark
images). Once the bugs were squashed, I was able to grab my first correct
image (see Figure 6). This was a major milestone of the LRF project.
FIGURE 6. Giving the “thumbs up” to the
camera to celebrate a successful frame
grab (actual lower-res image
from the camera).
Until Next Time ...
Happy with my progress thus far, the next steps are to complete the
hardware design, write the image processing routines, and test the LRF with
some real-world measurements. I'll discuss all of that in the next article.
See you then! SV
SERVO 10.2011 61