Perceptions of Justice By Algorithms
Artificial Intelligence and algorithms are increasingly able to replace human workers in cognitively sophisticated tasks, including ones related to justice. Many governments and international organizations are discussing policies related to the application of algorithmic judges in courts. In this paper, we investigate the public perceptions of algorithmic judges. Across two experiments (N = 1,822), and an internal meta-analysis (N = 3,039), our results show that even though court users acknowledge several advantages of algorithms (i.e., cost and speed), they trust human judges more and have greater intentions to go to the court when a human (vs. an algorithmic) judge adjudicates. Additionally, we demonstrate that the extent that individuals trust algorithmic and human judges depends on the nature of the case: trust for algorithmic judges is especially low when legal cases involve emotional complexities (vs. technically complex or uncomplicated cases).
| Main Authors: | Yalcin, Gizem; Themeli, Erlis; Stamhuis, Evert; Philipsen, Stefan; Puntoni, Stefano |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Springer Netherlands, 2022 |
| Subjects: | Original Research |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10102053/ https://www.ncbi.nlm.nih.gov/pubmed/37070085 http://dx.doi.org/10.1007/s10506-022-09312-z |
_version_ | 1785025619395870720 |
author | Yalcin, Gizem; Themeli, Erlis; Stamhuis, Evert; Philipsen, Stefan; Puntoni, Stefano
author_facet | Yalcin, Gizem; Themeli, Erlis; Stamhuis, Evert; Philipsen, Stefan; Puntoni, Stefano
author_sort | Yalcin, Gizem |
collection | PubMed |
description | Artificial Intelligence and algorithms are increasingly able to replace human workers in cognitively sophisticated tasks, including ones related to justice. Many governments and international organizations are discussing policies related to the application of algorithmic judges in courts. In this paper, we investigate the public perceptions of algorithmic judges. Across two experiments (N = 1,822), and an internal meta-analysis (N = 3,039), our results show that even though court users acknowledge several advantages of algorithms (i.e., cost and speed), they trust human judges more and have greater intentions to go to the court when a human (vs. an algorithmic) judge adjudicates. Additionally, we demonstrate that the extent that individuals trust algorithmic and human judges depends on the nature of the case: trust for algorithmic judges is especially low when legal cases involve emotional complexities (vs. technically complex or uncomplicated cases). SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s10506-022-09312-z. |
format | Online Article Text |
id | pubmed-10102053 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Springer Netherlands |
record_format | MEDLINE/PubMed |
spelling | pubmed-101020532023-04-15 Perceptions of Justice By Algorithms Yalcin, Gizem Themeli, Erlis Stamhuis, Evert Philipsen, Stefan Puntoni, Stefano Artif Intell Law (Dordr) Original Research Artificial Intelligence and algorithms are increasingly able to replace human workers in cognitively sophisticated tasks, including ones related to justice. Many governments and international organizations are discussing policies related to the application of algorithmic judges in courts. In this paper, we investigate the public perceptions of algorithmic judges. Across two experiments (N = 1,822), and an internal meta-analysis (N = 3,039), our results show that even though court users acknowledge several advantages of algorithms (i.e., cost and speed), they trust human judges more and have greater intentions to go to the court when a human (vs. an algorithmic) judge adjudicates. Additionally, we demonstrate that the extent that individuals trust algorithmic and human judges depends on the nature of the case: trust for algorithmic judges is especially low when legal cases involve emotional complexities (vs. technically complex or uncomplicated cases). SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s10506-022-09312-z. Springer Netherlands 2022-04-05 2023 /pmc/articles/PMC10102053/ /pubmed/37070085 http://dx.doi.org/10.1007/s10506-022-09312-z Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open AccessThis article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Original Research Yalcin, Gizem Themeli, Erlis Stamhuis, Evert Philipsen, Stefan Puntoni, Stefano Perceptions of Justice By Algorithms |
title | Perceptions of Justice By Algorithms |
title_full | Perceptions of Justice By Algorithms |
title_fullStr | Perceptions of Justice By Algorithms |
title_full_unstemmed | Perceptions of Justice By Algorithms |
title_short | Perceptions of Justice By Algorithms |
title_sort | perceptions of justice by algorithms |
topic | Original Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10102053/ https://www.ncbi.nlm.nih.gov/pubmed/37070085 http://dx.doi.org/10.1007/s10506-022-09312-z |
work_keys_str_mv | AT yalcingizem perceptionsofjusticebyalgorithms AT themelierlis perceptionsofjusticebyalgorithms AT stamhuisevert perceptionsofjusticebyalgorithms AT philipsenstefan perceptionsofjusticebyalgorithms AT puntonistefano perceptionsofjusticebyalgorithms |