GETTING UNDER OUR SKIN
Skin is the human body's largest sensory organ.
Understanding how it works will help roboticists create
more useful android skins. Thanks to a new report from Johns Hopkins researchers, we're a step closer to understanding the skin's sensory system.
The scientists created detailed maps of the branching
patterns of sensory nerves in mouse skin. The resulting maps
revealed 10 distinct groups that seem to correspond to
differences in nerve functions. For example, some nerve
types gather information from a single hair follicle while
others branch into groups that collect averaged information
from 200 or more different locations.
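To make that contrast concrete, here's a minimal Python sketch; the numbers and the averaging model are my own assumptions, not measurements from the study. A single-follicle afferent tracks one hair faithfully, while a pooling afferent reports only the average over its territory:

import random

# Toy comparison of two afferent types (illustrative only; the
# figures below are invented, not data from the Hopkins study).
random.seed(0)
hair_deflections = [random.random() for _ in range(200)]

# A single-follicle afferent tracks one hair exactly.
single_follicle_signal = hair_deflections[0]

# A pooling afferent averages over its whole territory, so any
# one hair's movement barely changes the output.
pooled_signal = sum(hair_deflections) / len(hair_deflections)

print(f"single follicle: {single_follicle_signal:.2f}")
print(f"pooled over 200 sites: {pooled_signal:.2f}")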
The images now in hand will help scientists make better sense of known responses to skin stimulation. For example, if a single nerve cell is responsible for monitoring a quarter-inch-square patch of skin, multiple simultaneous points of pressure within that patch will be perceived by the brain as only a single signal. As explained by
the researchers, this is why we can’t read Braille using the
skin on our backs. (Not that we would if we could.) The
multiple bumps that make up a Braille symbol are within
such a small area that the axon branches can’t distinguish
them. By contrast, each sensory axon on the fingertip occupies a much smaller territory, which permits our fingertips to accurately distinguish small objects.
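A quick sketch makes the geometry plain. In the toy model below, each axon covers a square patch of skin, and every pressure point landing in the same patch merges into one signal. The patch sizes and dot spacing are illustrative assumptions, not figures from the study:

def perceived_points(pressure_points, territory_size):
    """Pool pressure points into per-axon signals.

    Each sensory axon is modeled as covering a square patch of
    skin territory_size inches on a side; all points landing in
    the same patch merge, so the brain 'sees' one touch.
    """
    patches = {(int(x / territory_size), int(y / territory_size))
               for x, y in pressure_points}
    return len(patches)

# Six Braille-style dots packed into roughly a tenth of an inch.
braille_cell = [(0.00, 0.00), (0.00, 0.05), (0.00, 0.10),
                (0.05, 0.00), (0.05, 0.05), (0.05, 0.10)]

# Back: one axon covers a quarter-inch patch, so all six dots merge.
print("back:", perceived_points(braille_cell, 0.25))       # -> 1 signal
# Fingertip: much smaller territories keep the dots distinct.
print("fingertip:", perceived_points(braille_cell, 0.04))  # -> 6 signals

Shrink the territory and the dots come apart; widen it and they fuse. That's the whole story behind the Braille example.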
Johns Hopkins researchers also announced the
discovery of strong evidence that there are specific nerve
cells responsible for itch signals, which are distinct from
nerves involved in pain.
FACE IT, KID
A recent UCSD news release touts a new android designed to mimic the expressions of a one-year-old human child as it learns to control its body and interact with humans. This robot is a little less creepy and a little more lifelike than other baby bots, mainly because it combines some of the best technology out there: Japanese humanoid robotics hardware and a Hanson Robotics head. David Hanson's android heads have received widespread recognition as the most human-like and expressive around.
From the news release:
"We developed machine-learning methods to analyze
face-to-face interaction between mothers and infants, to
extract the underlying social controller used by infants, and
to port it to Diego-San. We then analyzed the resulting
interaction between Diego-San and adults. With high
definition cameras in the eyes, Diego-San sees people,
gestures, expressions, and uses AI modeled on human babies
to learn from people the way that a baby hypothetically
would. The facial expressions are important to establish a
relationship, and communicate intuitively to people."
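The release doesn't spell out how that "social controller" works, but here is one way to picture the general approach: learn a mapping from the adult's observed behavior to the infant's logged response, then replay it on the robot. Everything in this sketch (the features, the response labels, and the nearest-neighbor rule) is a hypothetical stand-in, not the Machine Perception Lab's actual method:

# Each log entry: (adult expression features, infant response).
# Features are made up, e.g. (smile_intensity, gaze_on_infant).
interaction_log = [
    ((0.9, 1.0), "smile"),      # adult smiles and looks: infant smiles
    ((0.1, 1.0), "coo"),        # neutral face, eye contact: vocalize
    ((0.0, 0.0), "look_away"),  # adult disengaged: infant disengages
    ((0.8, 0.0), "search"),     # smile but no eye contact: seek gaze
]

def choose_response(observed):
    """Pick the response whose logged context is nearest the
    observation (1-nearest-neighbor behavioral cloning)."""
    def dist(context):
        return sum((a - b) ** 2 for a, b in zip(context, observed))
    best = min(interaction_log, key=lambda entry: dist(entry[0]))
    return best[1]

# The robot sees a broad smile with direct gaze and responds in kind.
print(choose_response((0.85, 0.9)))  # -> "smile"

A real system would learn from hours of video and drive dozens of facial actuators, but the shape of the problem is the same: observed social context in, expressive response out.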
Diego-San was developed at the Machine Perception Lab
and funded by the National Science Foundation. The robot is
another small step towards robots that are able to interact
emotionally with humans. There's still a long way to go —
particularly with the development of affective systems in the
robot itself. Most work to date has been on expressing
simulated affects and on recognizing emotions in humans.
AN AMAZING OPPORTUNITY
Opportunity landed at Meridiani Planum nine years ago this January. The original warranty on this robot was a mere 90 days. That's fracking amazing.
So, what's Opportunity been up to? The robot
is currently hanging out in a place called Matijevic
Hill on the western rim of Endeavour Crater, some
22 miles from where Oppy touched down. Clay
minerals have been detected in this area from
orbit, suggesting that water may have modified the
rocks, so Opportunity will be checking the place
out down on the ground.
As for what's next, well, at some point this
little robot is going to stop working. It's already got
some quirks — nothing serious, but still, it's closing
in on a decade of non-stop operation. The fact that
we've got robots on Mars right now is something
that should be a continual source of wonder for
every single human on Earth.