
Trust within human-machine collectives depends on the perceived consensus about cooperative norms

With the progress of artificial intelligence and the emergence of global online communities, humans and machines are increasingly participating in mixed collectives in which they can help or hinder each other. Human societies have had thousands of years to consolidate the social norms that promote cooperation; but mixed collectives often struggle to articulate the norms which hold when humans coexist with machines. In five studies involving 7917 individuals, we document the way people treat machines differently than humans in a stylized society of beneficiaries, helpers, punishers, and trustors. We show that a different amount of trust is gained by helpers and punishers when they follow norms over not doing so. We also demonstrate that the trust-gain of norm-followers is associated with trustors’ assessment about the consensual nature of cooperative norms over helping and punishing. Lastly, we establish that, under certain conditions, informing trustors about the norm-consensus over helping tends to decrease the differential treatment of both machines and people interacting with them. These results allow us to anticipate how humans may develop cooperative norms for human-machine collectives, specifically, by relying on already extant norms in human-only groups. We also demonstrate that this evolution may be accelerated by making people aware of their emerging consensus.


Bibliographic Details
Main Authors: Makovi, Kinga; Sargsyan, Anahit; Li, Wendi; Bonnefon, Jean-François; Rahwan, Talal
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10229533/
https://www.ncbi.nlm.nih.gov/pubmed/37253759
http://dx.doi.org/10.1038/s41467-023-38592-5
author Makovi, Kinga
Sargsyan, Anahit
Li, Wendi
Bonnefon, Jean-François
Rahwan, Talal
collection PubMed
description With the progress of artificial intelligence and the emergence of global online communities, humans and machines are increasingly participating in mixed collectives in which they can help or hinder each other. Human societies have had thousands of years to consolidate the social norms that promote cooperation; but mixed collectives often struggle to articulate the norms which hold when humans coexist with machines. In five studies involving 7917 individuals, we document the way people treat machines differently than humans in a stylized society of beneficiaries, helpers, punishers, and trustors. We show that a different amount of trust is gained by helpers and punishers when they follow norms over not doing so. We also demonstrate that the trust-gain of norm-followers is associated with trustors’ assessment about the consensual nature of cooperative norms over helping and punishing. Lastly, we establish that, under certain conditions, informing trustors about the norm-consensus over helping tends to decrease the differential treatment of both machines and people interacting with them. These results allow us to anticipate how humans may develop cooperative norms for human-machine collectives, specifically, by relying on already extant norms in human-only groups. We also demonstrate that this evolution may be accelerated by making people aware of their emerging consensus.
format Online
Article
Text
id pubmed-10229533
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-10229533 2023-06-01 Trust within human-machine collectives depends on the perceived consensus about cooperative norms. Makovi, Kinga; Sargsyan, Anahit; Li, Wendi; Bonnefon, Jean-François; Rahwan, Talal. Nat Commun, Article. Nature Publishing Group UK, 2023-05-30. /pmc/articles/PMC10229533/ /pubmed/37253759 http://dx.doi.org/10.1038/s41467-023-38592-5. Text en. © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source, a link to the license is provided, and any changes are indicated. Third-party material in the article is covered by this license unless otherwise credited; uses not permitted by the license or by statutory regulation require permission from the copyright holder.
title Trust within human-machine collectives depends on the perceived consensus about cooperative norms
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10229533/
https://www.ncbi.nlm.nih.gov/pubmed/37253759
http://dx.doi.org/10.1038/s41467-023-38592-5