Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis
Main authors: | Larrazabal, Agostina J.; Nieto, Nicolás; Peterson, Victoria; Milone, Diego H.; Ferrante, Enzo |
Format: | Online Article Text |
Language: | English |
Published: | National Academy of Sciences, 2020 |
Subjects: | Physical Sciences |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7293650/ https://www.ncbi.nlm.nih.gov/pubmed/32457147 http://dx.doi.org/10.1073/pnas.1919012117 |
_version_ | 1783546335551553536 |
author | Larrazabal, Agostina J. Nieto, Nicolás Peterson, Victoria Milone, Diego H. Ferrante, Enzo |
author_facet | Larrazabal, Agostina J. Nieto, Nicolás Peterson, Victoria Milone, Diego H. Ferrante, Enzo |
author_sort | Larrazabal, Agostina J. |
collection | PubMed |
description | Artificial intelligence (AI) systems for computer-aided diagnosis and image-based screening are being adopted worldwide by medical institutions. In such a context, generating fair and unbiased classifiers becomes of paramount importance. The research community of medical image computing is making great efforts in developing more accurate algorithms to assist medical doctors in the difficult task of disease diagnosis. However, little attention is paid to the way databases are collected and how this may influence the performance of AI systems. Our study sheds light on the importance of gender balance in medical imaging datasets used to train AI systems for computer-assisted diagnosis. We provide empirical evidence supported by a large-scale study, based on three deep neural network architectures and two well-known publicly available X-ray image datasets used to diagnose various thoracic diseases under different gender imbalance conditions. We found a consistent decrease in performance for underrepresented genders when a minimum balance is not fulfilled. This raises the alarm for national agencies in charge of regulating and approving computer-assisted diagnosis systems, which should include explicit gender balance and diversity recommendations. We also establish an open problem for the academic medical image computing community which needs to be addressed by novel algorithms endowed with robustness to gender imbalance. |
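The abstract above describes an experimental protocol: deep classifiers are trained on chest X-ray datasets under different gender-imbalance conditions and then evaluated separately on female and male test patients. The following is a minimal sketch of that kind of setup, not code from the paper; the metadata file name, the "gender" and "label" column names, and the imbalance ratios are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): build gender-imbalanced training splits
# and evaluate a classifier separately on female and male test patients.
# The CSV path, the "gender"/"label" column names, and the ratios below are
# illustrative assumptions, not details taken from the paper.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score


def make_imbalanced_split(df, female_fraction, n_train, seed=0):
    """Sample a fixed-size training set with a given fraction of female patients."""
    rng = np.random.default_rng(seed)
    n_f = int(round(n_train * female_fraction))
    n_m = n_train - n_f
    idx_f = rng.choice(df.index[df["gender"] == "F"], size=n_f, replace=False)
    idx_m = rng.choice(df.index[df["gender"] == "M"], size=n_m, replace=False)
    return df.loc[np.concatenate([idx_f, idx_m])]


def per_gender_auc(test_df, scores):
    """ROC AUC computed separately for each gender subgroup of the test set."""
    aucs = {}
    for g in ("F", "M"):
        mask = (test_df["gender"] == g).to_numpy()
        aucs[g] = roc_auc_score(test_df.loc[mask, "label"], scores[mask])
    return aucs


if __name__ == "__main__":
    # Hypothetical per-image metadata: one row per X-ray with gender and a binary label.
    df = pd.read_csv("chest_xray_metadata.csv")
    for frac in (0.0, 0.25, 0.5, 0.75, 1.0):  # example female/male training ratios
        train_df = make_imbalanced_split(df, female_fraction=frac, n_train=10_000)
        # ...train any image classifier on train_df here, then score a held-out,
        # gender-balanced test set and inspect per-gender performance:
        # print(frac, per_gender_auc(test_df, predicted_scores))
```

Under a setup of this kind, the abstract reports a consistent drop in performance for whichever gender is underrepresented in training once a minimum balance is not met.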
format | Online Article Text |
id | pubmed-7293650 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | National Academy of Sciences |
record_format | MEDLINE/PubMed |
spelling | pubmed-7293650 2020-06-18 Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis Larrazabal, Agostina J. Nieto, Nicolás Peterson, Victoria Milone, Diego H. Ferrante, Enzo Proc Natl Acad Sci U S A Physical Sciences Artificial intelligence (AI) systems for computer-aided diagnosis and image-based screening are being adopted worldwide by medical institutions. In such a context, generating fair and unbiased classifiers becomes of paramount importance. The research community of medical image computing is making great efforts in developing more accurate algorithms to assist medical doctors in the difficult task of disease diagnosis. However, little attention is paid to the way databases are collected and how this may influence the performance of AI systems. Our study sheds light on the importance of gender balance in medical imaging datasets used to train AI systems for computer-assisted diagnosis. We provide empirical evidence supported by a large-scale study, based on three deep neural network architectures and two well-known publicly available X-ray image datasets used to diagnose various thoracic diseases under different gender imbalance conditions. We found a consistent decrease in performance for underrepresented genders when a minimum balance is not fulfilled. This raises the alarm for national agencies in charge of regulating and approving computer-assisted diagnosis systems, which should include explicit gender balance and diversity recommendations. We also establish an open problem for the academic medical image computing community which needs to be addressed by novel algorithms endowed with robustness to gender imbalance. National Academy of Sciences 2020-06-09 2020-05-26 /pmc/articles/PMC7293650/ /pubmed/32457147 http://dx.doi.org/10.1073/pnas.1919012117 Text en Copyright © 2020 the Author(s). Published by PNAS. http://creativecommons.org/licenses/by/4.0/ https://creativecommons.org/licenses/by/4.0/ This open access article is distributed under Creative Commons Attribution License 4.0 (CC BY) (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Physical Sciences Larrazabal, Agostina J. Nieto, Nicolás Peterson, Victoria Milone, Diego H. Ferrante, Enzo Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis |
title | Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis |
title_full | Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis |
title_fullStr | Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis |
title_full_unstemmed | Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis |
title_short | Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis |
title_sort | gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis |
topic | Physical Sciences |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7293650/ https://www.ncbi.nlm.nih.gov/pubmed/32457147 http://dx.doi.org/10.1073/pnas.1919012117 |
work_keys_str_mv | AT larrazabalagostinaj genderimbalanceinmedicalimagingdatasetsproducesbiasedclassifiersforcomputeraideddiagnosis AT nietonicolas genderimbalanceinmedicalimagingdatasetsproducesbiasedclassifiersforcomputeraideddiagnosis AT petersonvictoria genderimbalanceinmedicalimagingdatasetsproducesbiasedclassifiersforcomputeraideddiagnosis AT milonediegoh genderimbalanceinmedicalimagingdatasetsproducesbiasedclassifiersforcomputeraideddiagnosis AT ferranteenzo genderimbalanceinmedicalimagingdatasetsproducesbiasedclassifiersforcomputeraideddiagnosis |