
Gender Prediction for a Multiethnic Population via Deep Learning Across Different Retinal Fundus Photograph Fields: Retrospective Cross-sectional Study

Bibliographic Details
Main Authors: Betzler, Bjorn Kaijun, Yang, Henrik Hee Seung, Thakur, Sahil, Yu, Marco, Quek, Ten Cheer, Soh, Zhi Da, Lee, Geunyoung, Tham, Yih-Chung, Wong, Tien Yin, Rim, Tyler Hyungtaek, Cheng, Ching-Yu
Format: Online Article Text
Language: English
Published: JMIR Publications 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8408758/
https://www.ncbi.nlm.nih.gov/pubmed/34402800
http://dx.doi.org/10.2196/25165
_version_ 1783746876093235200
author Betzler, Bjorn Kaijun
Yang, Henrik Hee Seung
Thakur, Sahil
Yu, Marco
Quek, Ten Cheer
Soh, Zhi Da
Lee, Geunyoung
Tham, Yih-Chung
Wong, Tien Yin
Rim, Tyler Hyungtaek
Cheng, Ching-Yu
collection PubMed
description BACKGROUND: Deep learning algorithms have been built for the detection of systemic and eye diseases based on fundus photographs. The retina possesses features that can be affected by gender differences, and the extent to which these features are captured via photography differs depending on the retinal image field. OBJECTIVE: We aimed to compare deep learning algorithms’ performance in predicting gender based on different fields of fundus photographs (optic disc–centered, macula-centered, and peripheral fields). METHODS: This retrospective cross-sectional study included 172,170 fundus photographs of 9956 adults aged ≥40 years from the Singapore Epidemiology of Eye Diseases Study. Optic disc–centered, macula-centered, and peripheral field fundus images were included in this study as input data for a deep learning model for gender prediction. Performance was estimated at the individual level and image level. Receiver operating characteristic curves for binary classification were calculated. RESULTS: The deep learning algorithms predicted gender with an area under the receiver operating characteristic curve (AUC) of 0.94 at the individual level and an AUC of 0.87 at the image level. Across the three image field types, the best performance was seen when using optic disc–centered field images (younger subgroups: AUC=0.91; older subgroups: AUC=0.86), and algorithms that used peripheral field images had the lowest performance (younger subgroups: AUC=0.85; older subgroups: AUC=0.76). Across the three ethnic subgroups, algorithm performance was lowest in the Indian subgroup (AUC=0.88) compared to that in the Malay (AUC=0.91) and Chinese (AUC=0.91) subgroups when the algorithms were tested on optic disc–centered images. Algorithms’ performance in gender prediction at the image level was better in younger subgroups (aged <65 years; AUC=0.89) than in older subgroups (aged ≥65 years; AUC=0.82). CONCLUSIONS: We confirmed that gender among the Asian population can be predicted with fundus photographs by using deep learning, and our algorithms’ performance in terms of gender prediction differed according to the field of fundus photographs, age subgroups, and ethnic groups. Our work provides a further understanding of using deep learning models for the prediction of gender-related diseases. Further validation of our findings is still needed.
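The description above reports performance both per photograph (image level) and per participant (individual level). As an illustration only, and not the authors' code, the following minimal Python sketch shows one way such an evaluation could be set up with scikit-learn, assuming per-image gender probabilities from an already trained model and simple mean pooling of each subject's images (the pooling rule is an assumption; the article does not specify it).

import numpy as np
from sklearn.metrics import roc_auc_score

# Toy data: each entry is one fundus photograph (illustrative values only).
subject_ids = np.array([1, 1, 1, 2, 2, 3, 3, 3, 4, 4])   # person each image belongs to
y_true      = np.array([1, 1, 1, 0, 0, 1, 1, 1, 0, 0])   # 1 = male, 0 = female
y_prob      = np.array([0.9, 0.8, 0.7, 0.2, 0.65, 0.3, 0.9, 0.8, 0.3, 0.1])  # model's P(male) per image

# Image-level AUC: every photograph is scored as an independent observation.
image_auc = roc_auc_score(y_true, y_prob)

# Individual-level AUC: pool the per-image probabilities of each subject
# (here by a simple mean, an assumption), then score one value per person.
subj_labels, subj_probs = [], []
for sid in np.unique(subject_ids):
    mask = subject_ids == sid
    subj_labels.append(y_true[mask][0])      # gender label is constant within a subject
    subj_probs.append(y_prob[mask].mean())   # mean-pooled predicted probability
individual_auc = roc_auc_score(subj_labels, subj_probs)

print(f"image-level AUC:      {image_auc:.2f}")
print(f"individual-level AUC: {individual_auc:.2f}")

With these toy numbers the individual-level AUC exceeds the image-level AUC, mirroring the general pattern reported in the abstract, although the values themselves are arbitrary.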
format Online
Article
Text
id pubmed-8408758
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-8408758 2021-09-14 JMIR Med Inform Original Paper. JMIR Publications, 2021-08-17. ©Bjorn Kaijun Betzler, Henrik Hee Seung Yang, Sahil Thakur, Marco Yu, Ten Cheer Quek, Zhi Da Soh, Geunyoung Lee, Yih-Chung Tham, Tien Yin Wong, Tyler Hyungtaek Rim, Ching-Yu Cheng. Originally published in JMIR Medical Informatics (https://medinform.jmir.org), 17.08.2021. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Informatics, is properly cited.
The complete bibliographic information, a link to the original publication on https://medinform.jmir.org/, as well as this copyright and license information must be included.
title Gender Prediction for a Multiethnic Population via Deep Learning Across Different Retinal Fundus Photograph Fields: Retrospective Cross-sectional Study
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8408758/
https://www.ncbi.nlm.nih.gov/pubmed/34402800
http://dx.doi.org/10.2196/25165