Global healthcare fairness: We should be sharing more, not less, data
The availability of large, deidentified health datasets has enabled significant innovation in using machine learning (ML) to better understand patients and their diseases. However, questions remain regarding the true privacy of this data, patient control over their data, and how we regulate data sharing in a way that does not encumber progress or further potentiate biases for underrepresented populations.
Main Authors: | Seastedt, Kenneth P.; Schwab, Patrick; O’Brien, Zach; Wakida, Edith; Herrera, Karen; Marcelo, Portia Grace F.; Agha-Mir-Salim, Louis; Frigola, Xavier Borrat; Ndulue, Emily Boardman; Marcelo, Alvin; Celi, Leo Anthony |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science 2022 |
Subjects: | Review |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9931202/ https://www.ncbi.nlm.nih.gov/pubmed/36812599 http://dx.doi.org/10.1371/journal.pdig.0000102 |
_version_ | 1784889195941068800 |
---|---|
author | Seastedt, Kenneth P. Schwab, Patrick O’Brien, Zach Wakida, Edith Herrera, Karen Marcelo, Portia Grace F. Agha-Mir-Salim, Louis Frigola, Xavier Borrat Ndulue, Emily Boardman Marcelo, Alvin Celi, Leo Anthony |
author_facet | Seastedt, Kenneth P. Schwab, Patrick O’Brien, Zach Wakida, Edith Herrera, Karen Marcelo, Portia Grace F. Agha-Mir-Salim, Louis Frigola, Xavier Borrat Ndulue, Emily Boardman Marcelo, Alvin Celi, Leo Anthony |
author_sort | Seastedt, Kenneth P. |
collection | PubMed |
description | The availability of large, deidentified health datasets has enabled significant innovation in using machine learning (ML) to better understand patients and their diseases. However, questions remain regarding the true privacy of this data, patient control over their data, and how we regulate data sharing in a way that does not encumber progress or further potentiate biases for underrepresented populations. After reviewing the literature on potential reidentifications of patients in publicly available datasets, we argue that the cost—measured in terms of access to future medical innovations and clinical software—of slowing ML progress is too great to limit sharing data through large publicly available databases for concerns of imperfect data anonymization. This cost is especially great for developing countries where the barriers preventing inclusion in such databases will continue to rise, further excluding these populations and increasing existing biases that favor high-income countries. Preventing artificial intelligence’s progress towards precision medicine and sliding back to clinical practice dogma may pose a larger threat than concerns of potential patient reidentification within publicly available datasets. While the risk to patient privacy should be minimized, we believe this risk will never be zero, and society has to determine an acceptable risk threshold below which data sharing can occur—for the benefit of a global medical knowledge system. |
format | Online Article Text |
id | pubmed-9931202 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-9931202 2023-02-16 Global healthcare fairness: We should be sharing more, not less, data Seastedt, Kenneth P. Schwab, Patrick O’Brien, Zach Wakida, Edith Herrera, Karen Marcelo, Portia Grace F. Agha-Mir-Salim, Louis Frigola, Xavier Borrat Ndulue, Emily Boardman Marcelo, Alvin Celi, Leo Anthony PLOS Digit Health Review The availability of large, deidentified health datasets has enabled significant innovation in using machine learning (ML) to better understand patients and their diseases. However, questions remain regarding the true privacy of this data, patient control over their data, and how we regulate data sharing in a way that does not encumber progress or further potentiate biases for underrepresented populations. After reviewing the literature on potential reidentifications of patients in publicly available datasets, we argue that the cost—measured in terms of access to future medical innovations and clinical software—of slowing ML progress is too great to limit sharing data through large publicly available databases for concerns of imperfect data anonymization. This cost is especially great for developing countries where the barriers preventing inclusion in such databases will continue to rise, further excluding these populations and increasing existing biases that favor high-income countries. Preventing artificial intelligence’s progress towards precision medicine and sliding back to clinical practice dogma may pose a larger threat than concerns of potential patient reidentification within publicly available datasets. While the risk to patient privacy should be minimized, we believe this risk will never be zero, and society has to determine an acceptable risk threshold below which data sharing can occur—for the benefit of a global medical knowledge system. Public Library of Science 2022-10-06 /pmc/articles/PMC9931202/ /pubmed/36812599 http://dx.doi.org/10.1371/journal.pdig.0000102 Text en © 2022 Seastedt et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Review Seastedt, Kenneth P. Schwab, Patrick O’Brien, Zach Wakida, Edith Herrera, Karen Marcelo, Portia Grace F. Agha-Mir-Salim, Louis Frigola, Xavier Borrat Ndulue, Emily Boardman Marcelo, Alvin Celi, Leo Anthony Global healthcare fairness: We should be sharing more, not less, data |
title | Global healthcare fairness: We should be sharing more, not less, data |
title_full | Global healthcare fairness: We should be sharing more, not less, data |
title_fullStr | Global healthcare fairness: We should be sharing more, not less, data |
title_full_unstemmed | Global healthcare fairness: We should be sharing more, not less, data |
title_short | Global healthcare fairness: We should be sharing more, not less, data |
title_sort | global healthcare fairness: we should be sharing more, not less, data |
topic | Review |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9931202/ https://www.ncbi.nlm.nih.gov/pubmed/36812599 http://dx.doi.org/10.1371/journal.pdig.0000102 |
work_keys_str_mv | AT seastedtkennethp globalhealthcarefairnessweshouldbesharingmorenotlessdata AT schwabpatrick globalhealthcarefairnessweshouldbesharingmorenotlessdata AT obrienzach globalhealthcarefairnessweshouldbesharingmorenotlessdata AT wakidaedith globalhealthcarefairnessweshouldbesharingmorenotlessdata AT herrerakaren globalhealthcarefairnessweshouldbesharingmorenotlessdata AT marceloportiagracef globalhealthcarefairnessweshouldbesharingmorenotlessdata AT aghamirsalimlouis globalhealthcarefairnessweshouldbesharingmorenotlessdata AT frigolaxavierborrat globalhealthcarefairnessweshouldbesharingmorenotlessdata AT ndulueemilyboardman globalhealthcarefairnessweshouldbesharingmorenotlessdata AT marceloalvin globalhealthcarefairnessweshouldbesharingmorenotlessdata AT celileoanthony globalhealthcarefairnessweshouldbesharingmorenotlessdata |