Mind / Iron
by Bryan Bergeron, Editor • Published Monthly By T & L Publications, Inc.
Copyright 2014 by T & L Publications, Inc. All Rights Reserved.
SERVO 08.2014
Machine intelligence that is in some way superior to human intelligence is
often touted as the ultimate goal of AI research and development. Machines
have long been capable of making decisions and, in many cases, these decisions
are superior to those made by average humans. A common GPS wouldn’t pass
the Turing Test, but if I were lost in some big city, I’d refer to it before asking a
random biped on the street.
In medicine, cardiac monitoring machines have been able to interpret EKG
waveforms with excellent accuracy for decades. And yet, humans have remained
in the loop. One reason is to be doubly certain that the machine-rendered
diagnosis is correct. Another is legal liability. Interestingly, morality isn’t an issue.
When it comes to autonomous weaponry — whether a heat-seeking missile
or a drone equipped with optical recognition — machines are at least as capable
as human operators. However, at least publicly, drones and other autonomous
and semi-autonomous machines all have a human in the loop — not because of
limited “intelligence,” but for moral reasons.
I think human morality in the loop is a stopgap measure. Today, it’s
politically correct. Tomorrow, humans will be so outnumbered by autonomous
machines — including autonomous weapons — that it won’t be possible to have
humans in every kill-or-no-kill decision loop. I suppose that will be the time of
the autonomous “terminator.”
Similarly, in medicine, in a resource-limited situation with all else being
equal, should a robotic triage nurse attend to the old woman or young child
first? Or, what of the robotic surgeon performing an operation on a pregnant
woman with complications? Should the robot save the woman or the child?
The way I see it, intelligence is a minor hurdle. So, what’s the point of an
intelligent machine that lacks a moral compass? It may not matter if the
machine is operating, say, the power grid – unless, of course, there’s a decision
Shown below is an articulated humanoid
robot leg, built by researchers at the
Drexel Autonomous System Lab (DASL)
with a Tormach PCNC 1100 milling
machine. DASL researcher Roy Gross
estimates that somewhere between 300
and 400 components for “HUBO+” have
been machined on their PCNC 1100.