FIGURE 5. Y0U0 Format.
optical energy is lost at 650 nm. The shorter wavelength of
the 635 nm diode means more optical power can pass
through the camera’s filter. More optical power means
better detection of the red laser spot by the OVM7690
within an environment and, thus, a more likely successful
laser range measurement. Further discussion of webcam
IR filters can be found at http://www.david-
For those with safety concerns, the few documented
cases of eye damage with Class IIIa devices (which include
— for example — most run-of-the-mill red laser pointers,
laser levels, and laser-based thermometers) are related to
someone staring at the beam for a prolonged period. The
laser diode for the LRF is only enabled for a
single frame capture, which currently takes ~400 ms.
OVM7690 Camera Interface
The Omnivision OVM7690 provides a digital interface
to the host microcontroller:
• DVP[7:0] (Digital video port): Eight-bit wide
output bus corresponding to pixel information sent
in the selected output format from the OVM7690
(RAW RGB, RGB565, CCIR656, or YUV422).
• VSYNC (Vertical sync): Indicates the beginning of a
new frame by pulsing high.
• HREF (Horizontal reference): Indicates the start of
the next row of pixels by pulsing high. By keeping
count of the number of HREF pulses received since
the last VSYNC, we can determine which horizontal
line of the video frame we are currently on.
• PCLK (Pixel clock): Asserted when valid pixel data is
available on the DVP bus. For a 640 pixel line in
YUV422 format (16 bits/pixel, sent as two bytes over
the eight-bit bus), we should see 640 x 16 = 10,240
bits — that is, 1,280 pixel clock cycles — after an
HREF pulse.
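As a sanity check on those numbers, here is a short sketch — in plain C rather than the Propeller's Spin or PASM, purely for illustration — of the byte accounting the frame grabber has to do. The function names are my own; only the pixel counts come from the text above.

```c
#include <stdint.h>

/* Derived signal counts for one 640x480 YUV422 (YUY2) frame on the
   OVM7690's eight-bit DVP bus. YUY2 carries two bytes per pixel, and
   the bus clocks out one byte per PCLK. */
enum { WIDTH = 640, HEIGHT = 480, BYTES_PER_PIXEL = 2 };

/* PCLK cycles between two HREF pulses: one cycle per byte. */
unsigned pclks_per_line(void)
{
    return WIDTH * BYTES_PER_PIXEL;         /* 1,280 for VGA YUY2 */
}

/* Total bytes clocked out between two VSYNC pulses. */
uint32_t bytes_per_frame(void)
{
    return (uint32_t)pclks_per_line() * HEIGHT;
}

/* Which horizontal line we are on, given the number of HREF pulses
   counted since the last VSYNC (0-based row index). */
unsigned current_line(unsigned href_pulses_since_vsync)
{
    return href_pulses_since_vsync;
}
```

A capture routine on real hardware would increment these counters inside its pixel loop; this version only exists to make the arithmetic checkable off-target.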
To help reduce intellectual property theft, OmniVision
(like many other camera vendors) does not release many
specific camera operating details. Documentation was thin
on how the registers needed to be configured to get the
camera up and running. I had to port the start-up settings
from a file provided with OmniVision's PC-based evaluation
tool (OVTATool), which communicates with an OmniVision
camera module over a USB interface, and reverse-engineer
a few additional settings by watching the bus
communication between the camera and the PC host.

60 SERVO 10.2011
All told, there were nearly one hundred eight-bit
registers requiring configuration, including — but not
limited to — general settings, output format selection,
resolution, frames per second, lens correction values, color
matrix values, edge and de-noise settings, Automatic
Exposure Control (AEC), Automatic Gain Control (AGC), Automatic
White Balance (AWB), gamma values, and flicker control.
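In practice, a configuration like this boils down to a long table of address/value writes pushed to the camera one at a time. The sketch below shows one way such a table might be structured in C; the register addresses and values are made-up placeholders (not real OVM7690 settings), and sccb_write() is a stand-in for whatever bus routine the host actually provides — here it just records writes into a simulated register map so the logic can be exercised without hardware.

```c
#include <stdint.h>
#include <stddef.h>

/* One eight-bit register write: an address/value pair, in the style of
   the settings file exported by OVTATool. Placeholder values only. */
typedef struct { uint8_t reg; uint8_t val; } reg_write_t;

static const reg_write_t camera_init[] = {
    { 0x00, 0x01 },  /* placeholder: general settings  */
    { 0x01, 0x02 },  /* placeholder: output format     */
    { 0x02, 0x03 },  /* placeholder: resolution/timing */
    /* ...a real table runs to nearly one hundred entries... */
};

/* Hypothetical bus write. A real port would perform an SCCB (I2C-like)
   transaction here; this stub writes to a simulated register map. */
static uint8_t regmap[256];
static void sccb_write(uint8_t reg, uint8_t val)
{
    regmap[reg] = val;
}

/* Walk the table and push every setting to the camera in order. */
static void camera_configure(const reg_write_t *tbl, size_t n)
{
    for (size_t i = 0; i < n; i++)
        sccb_write(tbl[i].reg, tbl[i].val);
}
```

Keeping the settings in a const table like this makes it easy to diff against the evaluation tool's file when reverse-engineering which registers actually matter.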
I configured the camera for its maximum 640x480
VGA resolution and for data output in the YUV422 format
( http://en.wikipedia.org/wiki/YUV and
http://en.wikipedia.org/wiki/YCbCr). Y is the luma
component — brightness in grayscale — and U and V are
chroma components — color differences of blue and red,
respectively. The particular format of YUV422 used by the
OVM7690 is known as YUY2 ( www.fourcc.org/yuv.php),
in which each 16-bit pixel is given an eight-bit Y
component and alternating eight-bit U or eight-bit V
component. Y0U0 corresponds to a single pixel starting
from the left, Y1V0 is the second pixel, etc. Every location
has Y data, while U and V are sampled at every other pixel.
Check out Figure 5 for the byte layout.
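To make that byte ordering concrete, here is a small C sketch — my own illustration, not code from the project — that unpacks a YUY2 stream: each four-byte group Y0 U0 Y1 V0 yields two pixels that each get their own Y but share one U/V pair.

```c
#include <stdint.h>
#include <stddef.h>

typedef struct { uint8_t y, u, v; } yuv_t;

/* Unpack a YUY2 (Y0 U0 Y1 V0) byte stream into per-pixel YUV triples.
   npixels must be even: every pixel has its own Y sample, while each
   adjacent pair of pixels shares a single U and a single V sample. */
void yuy2_unpack(const uint8_t *src, yuv_t *dst, size_t npixels)
{
    for (size_t i = 0; i < npixels; i += 2) {
        uint8_t y0 = src[0];   /* luma, first pixel of the pair  */
        uint8_t u  = src[1];   /* blue-difference chroma, shared */
        uint8_t y1 = src[2];   /* luma, second pixel of the pair */
        uint8_t v  = src[3];   /* red-difference chroma, shared  */
        dst[i]     = (yuv_t){ y0, u, v };
        dst[i + 1] = (yuv_t){ y1, u, v };
        src += 4;
    }
}
```

For a 640-pixel line, this consumes the 1,280 bytes clocked out between HREF pulses and produces 640 full YUV triples.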
Timing is Everything:
My First Frame Capture
My first attempt at retrieving the digital video signal
from the OVM7690 using my custom development
platform looked fine in theory, but the frame grabber
routine, written in Spin, was too slow to keep up with the
camera. I started over from scratch and rewrote it in the
more efficient Propeller Assembly (PASM).
The frame grabber cog only runs when started by a
Joe Grand is an electrical engineer and the president
of Grand Idea Studio ( www.grandideastudio.com),
where he specializes in the design and licensing of
consumer products and modules for electronics hobbyists.
He can be reached at email@example.com.