Saturday, January 25, 2014

Robotic warfare laws great for science fiction, not so much in the real world

Unless we insist. And insist and insist.

“The best known set of laws are Isaac Asimov's ‘Three Laws of Robotics.’ The Three Laws are:

1. “A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. “A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

3. “A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.”

“’(T)he dominance enjoyed by the United States in the late 1990s/early 2000s in the areas of high-end sensors, guided weaponry, battle networking, space and cyberspace systems, and stealth technology has started to erode. Moreover, this erosion is now occurring at an accelerated rate.’” Work and Brimley, quoted at:

“All of our technologies [today] rely on a reliable, redundant, and secure network…. If we lose that, we lose all the advantages.” -- TRADOC Col. Christopher Cross, quoted in same reference.


(From a non-robotic source: Planners, scientists, technicians, and bureaucrats should not forget that low-level technology may be able to defeat sophisticated electronic/computerized weapons systems. Night vision devices did not see every guerrilla in the jungle, people sniffers did not find every VC/NVA soldier, and seismic probes dropped in Laos and Cambodia did not count every person who walked by. During the NATO bombing campaign, Serb forces used open-door microwave ovens to mimic antiaircraft radar. Every system has an operational weakness or two. Or three.)
