Dealing with digital misinformation: a polarised context of narratives and tribes
Main author: | Zollo, Fabiana |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | John Wiley and Sons Inc., 2019 |
Subjects: | Engagement and Expertise |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7015504/ https://www.ncbi.nlm.nih.gov/pubmed/32626457 http://dx.doi.org/10.2903/j.efsa.2019.e170720 |
_version_ | 1783496808544075776 |
---|---|
author | Zollo, Fabiana |
author_facet | Zollo, Fabiana |
author_sort | Zollo, Fabiana |
collection | PubMed |
description | The advent of the internet and social networks has revolutionised the information space and changed the way in which we communicate and get informed. On the internet, a huge amount of information competes for our (limited) attention. Moreover, despite the increasing quantity of content, quality may be poor, making the environment particularly fertile for the spread of misinformation. In such a context, our cognitive biases emerge, first and foremost confirmation bias, i.e. the human tendency to look for information that already agrees with one's system of beliefs. To shed light on the phenomenon, we present a collection of works investigating how information gets consumed and shapes communities on Facebook. We find that confirmation bias plays a crucial role in content selection and diffusion, and we provide empirical evidence of the existence of echo chambers, i.e. well-separated and polarised groups of like-minded users sharing the same narrative. Immersed in these bubbles, users keep framing and reinforcing their world view, ignoring information dissenting from their preferred narrative. In this scenario, corrections in the form of fact-checking or debunking attempts seem to fail and may instead backfire. To counter misinformation, smoothing polarisation is therefore essential, and may require the design of tailored counter-narratives and appropriate communication strategies, particularly for sensitive topics. |
format | Online Article Text |
id | pubmed-7015504 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | John Wiley and Sons Inc. |
record_format | MEDLINE/PubMed |
spelling | pubmed-7015504 2020-07-02 Dealing with digital misinformation: a polarised context of narratives and tribes. Zollo, Fabiana. EFSA J, Engagement and Expertise. John Wiley and Sons Inc. 2019-07-08 /pmc/articles/PMC7015504/ /pubmed/32626457 http://dx.doi.org/10.2903/j.efsa.2019.e170720 Text en © 2019 European Food Safety Authority. EFSA Journal published by John Wiley and Sons Ltd on behalf of European Food Safety Authority.
This is an open access article under the terms of the http://creativecommons.org/licenses/by-nd/4.0/ License, which permits use and distribution in any medium, provided the original work is properly cited and no modifications or adaptations are made. |
spellingShingle | Engagement and Expertise Zollo, Fabiana Dealing with digital misinformation: a polarised context of narratives and tribes |
title | Dealing with digital misinformation: a polarised context of narratives and tribes |
title_full | Dealing with digital misinformation: a polarised context of narratives and tribes |
title_fullStr | Dealing with digital misinformation: a polarised context of narratives and tribes |
title_full_unstemmed | Dealing with digital misinformation: a polarised context of narratives and tribes |
title_short | Dealing with digital misinformation: a polarised context of narratives and tribes |
title_sort | dealing with digital misinformation: a polarised context of narratives and tribes |
topic | Engagement and Expertise |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7015504/ https://www.ncbi.nlm.nih.gov/pubmed/32626457 http://dx.doi.org/10.2903/j.efsa.2019.e170720 |
work_keys_str_mv | AT zollofabiana dealingwithdigitalmisinformationapolarisedcontextofnarrativesandtribes |