
High level of correspondence across different news domain quality rating sets


Bibliographic Details
Main Authors: Lin, Hause, Lasser, Jana, Lewandowsky, Stephan, Cole, Rocky, Gully, Andrew, Rand, David G, Pennycook, Gordon
Format: Online Article Text
Language: English
Published: Oxford University Press 2023
Subjects: Social and Political Sciences
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10500312/
https://www.ncbi.nlm.nih.gov/pubmed/37719749
http://dx.doi.org/10.1093/pnasnexus/pgad286
Collection: PubMed
Record ID: pubmed-10500312
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Description: One widely used approach for quantifying misinformation consumption and sharing is to evaluate the quality of the news domains that a user interacts with. However, different media organizations and fact-checkers have produced different sets of news domain quality ratings, raising questions about the reliability of these ratings. In this study, we compared six sets of expert ratings and found that they generally correlated highly with one another. We then created a comprehensive set of domain ratings for use by the research community (github.com/hauselin/domain-quality-ratings), leveraging an ensemble “wisdom of experts” approach. To do so, we performed imputation together with principal component analysis to generate a set of aggregate ratings. The resulting rating set comprises 11,520 domains—the most extensive coverage to date—and correlates well with other rating sets that have more limited coverage. Together, these results suggest that experts generally agree on the relative quality of news domains, and the aggregate ratings that we generate offer a powerful research tool for evaluating the quality of news consumed or shared and the efficacy of misinformation interventions.
Journal: PNAS Nexus (Social and Political Sciences)
Published Online: 2023-09-02
© The Author(s) 2023. Published by Oxford University Press on behalf of National Academy of Sciences.
License: This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.
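The abstract describes generating aggregate domain ratings by combining imputation with principal component analysis. Below is a minimal sketch of that general kind of pipeline, assuming column-mean imputation and the first principal component as the aggregate score; the data, variable names, and specific imputation choice are illustrative and not taken from the paper's actual method or dataset.

```python
import numpy as np

# Rows = news domains, columns = expert rating sets (0-1 quality
# scores); NaN marks a domain the rater did not cover.
ratings = np.array([
    [0.9, 0.8, np.nan, 0.85],
    [0.2, np.nan, 0.3, 0.25],
    [0.7, 0.75, 0.8, np.nan],
    [np.nan, 0.1, 0.15, 0.2],
])

# 1. Imputation: fill each missing value with that rater's mean score.
col_means = np.nanmean(ratings, axis=0)
filled = np.where(np.isnan(ratings), col_means, ratings)

# 2. PCA: center the columns and take the first principal component
#    (the leading right singular vector of the centered matrix).
centered = filled - filled.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = vt[0]
# Orient the component so higher aggregate scores mean higher quality.
if pc1.sum() < 0:
    pc1 = -pc1

# One aggregate quality score per domain.
aggregate = centered @ pc1
```

Because the rating sets are positively correlated, the first component captures their shared notion of quality, and each domain's projection onto it serves as a single ensemble rating.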