The Retribution‑Gap and Responsibility‑Loci Related to Robots and Automated Technologies: A Reply to Nyholm

Research output: Contribution to journal/periodical › Article › Scientific › peer-review

Abstract

Automated technologies and robots make decisions that cannot always be fully
controlled or predicted. Moreover, they cannot respond to punishment and blame
in the ways humans do. When automated cars harm or kill people, therefore,
concerns arise about responsibility-gaps and retribution-gaps. According to Sven Nyholm, however, automated cars do not pose a challenge to human responsibility, as long as humans can control them (even if only indirectly) and update them. He argues that the agency exercised in automated cars should be understood in terms of human–robot collaborations. This brief note focuses on the problem that arises when multiple people are involved but there is no obvious shared collaboration among them. Building on John Danaher’s discussion of command responsibility, it is argued that, although Nyholm might be right that autonomous cars cannot be regarded as acting on their own, independently of any human beings, worries about responsibility-gaps and retribution-gaps remain justified, because it often stays unclear how to allocate or distribute responsibility satisfactorily among the key humans involved, even after they have been successfully identified.
Original language: English
Pages (from-to): 1-9
Journal: Science and Engineering Ethics
DOIs
Publication status: Published - 02 Jul 2019

Keywords

  • agency
  • responsibility-gaps
  • retribution-gaps
  • human-robot collaborations
