Professor Masatoshi Ishikawa from the University
of Tokyo is showing off his lab's 1,000-frames-per-second
camera, which uses a pan-tilt system to track a ping pong ball.
The device is so fast it can always keep the ball in the
center of the frame.
Possible applications include tracking balls or
players on sports broadcasts, and recording the detailed
dynamics of a flying bird or a fast-moving vehicle.
How do they do it? The camera uses a custom
vision chip that monitors which pixels are changing,
and by doing that one thousand times per second it
can keep track of fast-moving objects (bouncing balls,
flipping pages, falling eggs, etc.).
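The change-detection idea can be sketched in a few lines. This is an illustrative NumPy example, not the lab's actual chip logic: difference two consecutive grayscale frames, threshold the result, and take the centroid of the changed pixels. The threshold value and frame sizes here are assumptions.

```python
import numpy as np

def track_step(prev_frame, frame, threshold=30):
    """One iteration of change-based tracking: difference two
    consecutive grayscale frames, threshold the motion mask, and
    return the centroid of the changed pixels (or None if static)."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > threshold
    if not moving.any():
        return None
    ys, xs = np.nonzero(moving)
    return xs.mean(), ys.mean()  # centroid of the moving region

# At 1,000 fps an object moves only a few pixels between frames,
# so the motion mask stays small and the centroid is easy to follow.
h, w = 64, 64
prev = np.zeros((h, w), dtype=np.uint8)
frame = prev.copy()
frame[30:34, 40:44] = 255  # a small bright "ball" appears
cx, cy = track_step(prev, frame)
```

Because consecutive frames at 1,000 fps are nearly identical, even this naive differencing isolates the moving object cleanly, which is what makes the chip-level approach practical.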
Sports broadcasts are hugely popular, so high-quality, dynamic
footage is in great demand. However, it is often hard for camera
operators to keep the camera trained on a fast-moving subject
such as a particular player or the ball. Current methods have been
limited to either moving the camera's gaze slowly with a wide
angle of view, or controlling the gaze based on a prediction.
Super-slow-motion, close-up footage of a player or the ball would be
especially valuable, but camera operators have not been able to
capture it as well as they'd like. To help solve this problem, the Ishikawa Oku Laboratory developed its 1 ms auto pan-tilt
technology, which automatically controls the camera's pan and tilt angles to keep an object always at the center of the frame, just
as autofocus keeps an object in focus. Even a high-speed object like a bouncing ping pong ball in play can be held at the
center, thanks to a high-speed optical gaze controller called the Saccade Mirror and 1,000 fps high-speed vision. The Saccade Mirror steers
the camera's gaze not by moving the camera itself but by rotating small two-axis galvanometer mirrors. It can deflect the
gaze through 60 degrees, its widest angle, in both pan and tilt, and steering the gaze by 40 degrees takes only 3.5 ms. The newest
prototype system achieves full HD image quality, suitable for actual broadcasting.
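The control loop closing the gap between the 1,000 fps tracker and the mirrors can be sketched as a simple proportional correction from pixel offset to mirror angle. The linear pixel-to-angle mapping, the gain, and the field-of-view value below are illustrative assumptions, not the lab's published controller:

```python
def mirror_correction(cx, cy, width, height, fov_deg=60.0, gain=0.9):
    """Convert the tracked centroid's offset from the image center
    into pan/tilt angle corrections for the steering mirrors.
    Assumes the field of view maps linearly onto the sensor."""
    pan = gain * (cx - width / 2) / width * fov_deg
    tilt = gain * (cy - height / 2) / height * fov_deg
    return pan, tilt

# Ball detected right of center: command a positive pan correction.
pan, tilt = mirror_correction(cx=400, cy=240, width=640, height=480)
```

With updates arriving every millisecond, the per-frame offsets are tiny, so even a crude proportional law like this keeps the target near the frame center; the real system adds the dynamics needed to slew 40 degrees in 3.5 ms.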
AILA — BESMAN FOR THE JOB
DFKI Bremen’s humanoid robot AILA is being readied for work in space, thanks
to 3.8 million euros in funding from the German Aerospace Center (DLR). Project
BesMan (Behaviors for Mobile Manipulation) will run for the next four years to develop
the control software necessary to teleoperate robots in space. Specifically, the robot
will mimic human movements of the torso, arms, and hands. Already, AILA has been
given a new pair of five-fingered hands which are much more capable than the
fingerless pads it had before (they only picked up boxes, which doesn’t really require
fingers). Like NASA’s Robonaut R2 and Russia’s SAR-400, AILA ISS will be required to
grasp and use tools, as well as operate control panels. Although it will be teleoperated
by a human on Earth most of the time, it will also need to perceive changes in the
environment and act independently should the need arise.
Researchers are already thinking beyond the space station; the software will be
designed to work with robots of varying shapes, from humanoids like DLR’s Justin to
multi-legged climbing robots. It could then be used to teleoperate robots designed to
assemble solar panel energy stations on the Moon ahead of a manned mission.
In order to recreate human-like movements, the researchers are experimenting
with a motion-capture system. Basically, a researcher in the lab performs an action
which is then simulated on the computer. The software will break up the movements
into smaller segments that can be sent into space and used when necessary. “We must
build systems that approach the capabilities of people,” says Prof. Dr. Frank Kirchner,
Director of DFKI Robotics Innovation Center and the Robotics Group at the University of Bremen.
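The idea of breaking a captured movement into smaller reusable segments can be sketched with a common heuristic: cut the trajectory at local speed minima, where one sub-movement typically ends and the next begins. This is an illustrative example of that general technique, not BesMan's actual software:

```python
import numpy as np

def segment_motion(positions, dt=0.01):
    """Split a recorded 1-D motion trajectory into segments at local
    speed minima, a common heuristic for extracting reusable motion
    primitives from motion-capture data. Illustrative only."""
    speed = np.abs(np.gradient(positions, dt))
    # Indices where speed dips below its neighbors: segment boundaries.
    cuts = [i for i in range(1, len(speed) - 1)
            if speed[i] < speed[i - 1] and speed[i] <= speed[i + 1]]
    bounds = [0] + cuts + [len(positions)]
    return [positions[a:b] for a, b in zip(bounds, bounds[1:])]

# A reach, a pause, then a second reach: the pause separates
# the trajectory into two motion segments.
traj = np.concatenate([np.linspace(0, 1, 10),
                       np.full(5, 1.0),
                       np.linspace(1, 2, 10)])
segments = segment_motion(traj)
```

Pre-segmented primitives like these could be uplinked once and then sequenced on board, which matters when the round-trip delay to a teleoperated robot rules out continuous direct control.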
AILA ISS shows off her new five-fingered hands.
SERVO 09.2012 29