Toward a more nuanced understanding of probability estimation biases

In real life, we often have to make judgements under uncertainty. One such judgement task is estimating the probability of a given event based on uncertain evidence for the event, such as estimating the chances of actual fire when the fire alarm goes off. On the one hand, previous studies have shown...

Full description

Bibliographic Details
Main Authors: Branch, Fallon; Hegdé, Jay
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10101207/
https://www.ncbi.nlm.nih.gov/pubmed/37063564
http://dx.doi.org/10.3389/fpsyg.2023.1132168
_version_ 1785025460661387264
author Branch, Fallon
Hegdé, Jay
author_facet Branch, Fallon
Hegdé, Jay
author_sort Branch, Fallon
collection PubMed
description In real life, we often have to make judgements under uncertainty. One such judgement task is estimating the probability of a given event based on uncertain evidence for the event, such as estimating the chances of actual fire when the fire alarm goes off. On the one hand, previous studies have shown that human subjects often significantly misestimate the probability in such cases. On the other hand, these studies have offered divergent explanations as to the exact causes of these judgement errors (or, synonymously, biases). For instance, different studies have attributed the errors to the neglect (or underweighting) of the prevalence (or base rate) of the given event, or the overweighting of the evidence for the individual event ('individuating information'), etc. However, whether or to what extent any such explanation can fully account for the observed errors remains unclear. To help fill this gap, we studied the probability estimation performance of non-professional subjects under four different real-world problem scenarios: (i) estimating the probability of cancer in a mammogram given the relevant evidence from a computer-aided cancer detection system, (ii) estimating the probability of drunkenness based on breathalyzer evidence, and (iii & iv) estimating the probability of an enemy sniper based on two different sets of evidence from a drone reconnaissance system. In each case, we quantitatively characterized the contributions of the various potential explanatory variables to the subjects' probability judgements. We found that while the various explanatory variables together accounted for about 30 to 45% of the overall variance of the subjects' responses depending on the problem scenario, no single factor was sufficient to account for more than 53% of the explainable variance (or about 16 to 24% of the overall variance), let alone all of it. Further analyses of the explained variance revealed the surprising fact that no single factor accounted for significantly more than its 'fair share' of the variance. Taken together, our results demonstrate quantitatively that it is statistically untenable to attribute the errors of probabilistic judgement to any single cause, including base rate neglect. A more nuanced and unifying explanation would be that the actual biases reflect a weighted combination of multiple contributing factors, the exact mix of which depends on the particular problem scenario.
format Online
Article
Text
id pubmed-10101207
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-101012072023-04-14 Toward a more nuanced understanding of probability estimation biases Branch, Fallon Hegdé, Jay Front Psychol Psychology In real life, we often have to make judgements under uncertainty. One such judgement task is estimating the probability of a given event based on uncertain evidence for the event, such as estimating the chances of actual fire when the fire alarm goes off. On the one hand, previous studies have shown that human subjects often significantly misestimate the probability in such cases. On the other hand, these studies have offered divergent explanations as to the exact causes of these judgement errors (or, synonymously, biases). For instance, different studies have attributed the errors to the neglect (or underweighting) of the prevalence (or base rate) of the given event, or the overweighting of the evidence for the individual event ('individuating information'), etc. However, whether or to what extent any such explanation can fully account for the observed errors remains unclear. To help fill this gap, we studied the probability estimation performance of non-professional subjects under four different real-world problem scenarios: (i) estimating the probability of cancer in a mammogram given the relevant evidence from a computer-aided cancer detection system, (ii) estimating the probability of drunkenness based on breathalyzer evidence, and (iii & iv) estimating the probability of an enemy sniper based on two different sets of evidence from a drone reconnaissance system. In each case, we quantitatively characterized the contributions of the various potential explanatory variables to the subjects' probability judgements. We found that while the various explanatory variables together accounted for about 30 to 45% of the overall variance of the subjects' responses depending on the problem scenario, no single factor was sufficient to account for more than 53% of the explainable variance (or about 16 to 24% of the overall variance), let alone all of it. Further analyses of the explained variance revealed the surprising fact that no single factor accounted for significantly more than its 'fair share' of the variance. Taken together, our results demonstrate quantitatively that it is statistically untenable to attribute the errors of probabilistic judgement to any single cause, including base rate neglect. A more nuanced and unifying explanation would be that the actual biases reflect a weighted combination of multiple contributing factors, the exact mix of which depends on the particular problem scenario. Frontiers Media S.A. 2023-03-30 /pmc/articles/PMC10101207/ /pubmed/37063564 http://dx.doi.org/10.3389/fpsyg.2023.1132168 Text en Copyright © 2023 Branch and Hegdé. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Psychology
Branch, Fallon
Hegdé, Jay
Toward a more nuanced understanding of probability estimation biases
title Toward a more nuanced understanding of probability estimation biases
title_full Toward a more nuanced understanding of probability estimation biases
title_fullStr Toward a more nuanced understanding of probability estimation biases
title_full_unstemmed Toward a more nuanced understanding of probability estimation biases
title_short Toward a more nuanced understanding of probability estimation biases
title_sort toward a more nuanced understanding of probability estimation biases
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10101207/
https://www.ncbi.nlm.nih.gov/pubmed/37063564
http://dx.doi.org/10.3389/fpsyg.2023.1132168
work_keys_str_mv AT branchfallon towardamorenuancedunderstandingofprobabilityestimationbiases
AT hegdejay towardamorenuancedunderstandingofprobabilityestimationbiases