The Retribution-Gap and Responsibility-Loci Related to Robots and Automated Technologies: A Reply to Nyholm
Main Author:
Format: Online Article (Text)
Language: English
Published: Springer Netherlands, 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7089880/ https://www.ncbi.nlm.nih.gov/pubmed/31267376 http://dx.doi.org/10.1007/s11948-019-00120-4
Summary: Automated technologies and robots make decisions that cannot always be fully controlled or predicted. Moreover, they cannot respond to punishment and blame in the ways humans do. Therefore, when automated cars harm or kill people, for example, this gives rise to concerns about responsibility-gaps and retribution-gaps. According to Sven Nyholm, however, automated cars do not pose a challenge to human responsibility, as long as humans can control them (even if only indirectly) and update them. He argues that the agency exercised in automated cars should be understood in terms of human–robot collaborations. This brief note focuses on the problem that arises when multiple people are involved but there is no obvious shared collaboration among them. Building on John Danaher's discussion of command responsibility, it is argued that, although Nyholm might be right that autonomous cars cannot be regarded as acting on their own, independently of any human beings, worries about responsibility-gaps and retribution-gaps are still justified, because it often remains unclear how to allocate or distribute responsibility satisfactorily among the key humans involved after they have been successfully identified.