14/05/22

Back in 1984, James Cameron's The Terminator showed us what the future might hold if machines begin to think for themselves. It is particularly interesting to revisit that vision now, as we stand on the verge of creating autonomous military robots capable of deciding what is right and what is wrong.

Check out: Robots With Morals? The DoD is Spending $7.5 Million to Make it Happen


Glossary

  • capable  –  having the ability, fitness, or quality necessary to do or achieve a specified thing
  • outcome – the way a thing turns out; a consequence
  • discern – perceive or recognize (something)
  • encounter – unexpectedly be faced with or experience (something hostile or difficult)
  • dissent – the holding or expression of opinions at variance with those commonly or officially held
  • off-beat – unconventional; unusual
  • brazen – bold and without shame

 

Think about it

Answer the questions below.

  • What does President Obama think about humanoid robots?
  • How do proponents of “moral robots” justify their stance?
  • What is operational morality and functional morality?
  • What kind of moral decisions might robots have to make in a disaster scenario?
  • What arguments do opponents of “moral robots” use?

 

Practice makes perfect

 

In the sentences below, replace the fragments in bold with the words and phrases used in the original article.

  • The Office of Naval Research has allocated $7.5 million in grant money over the next five years for university researchers to build a robot with moral reasoning capabilities.
  • Supporters of the plan argue a “sense of moral consequence” could allow robotic systems to operate as one part of a more efficient — and truly autonomous — defense infrastructure.
  • And some of those advocates think pre-programmed machines would make better decisions than humans, since they could only obey strict rules of engagement and calculate potential results for multiple different scenarios.
  • Human lives and property depend on the outcomes of these decisions and so it is vital to make them carefully and with full knowledge of the capabilities and limitations of the systems involved.

 

Fill in the blank spaces with the missing words. Use ONE word per blank space.

“Even if ________ systems aren’t armed, they may still be forced to ________ moral decisions,” Bello said. ________ instance, in a disaster scenario, a robot may be forced to make ________ choice about ________ to evacuate or treat first, a situation where a bot might use some sense of ethical or moral reasoning. “While the kinds of systems we envision have much broader use in first-response, search-and-rescue and in ________ medical domain, we can’t take the idea of in-theater robots completely ________ the table,” Bello said.

Explore it more

 
