
Asymmetry between right and left fundus images identified using convolutional neural networks

We analyzed fundus images to identify whether convolutional neural networks (CNNs) can discriminate between right and left fundus images. We gathered 98,038 fundus photographs from the Gyeongsang National University Changwon Hospital, South Korea, and augmented these with the Ocular Disease Intelligent Recognition dataset. We created eight combinations of image sets to train CNNs. Class activation mapping was used to identify the discriminative image regions used by the CNNs. CNNs identified right and left fundus images with high accuracy (more than 99.3% in the Gyeongsang National University Changwon Hospital dataset and 91.1% in the Ocular Disease Intelligent Recognition dataset) regardless of whether the images were flipped horizontally. The depth and complexity of the CNN affected the accuracy (DenseNet121: 99.91%, ResNet50: 99.86%, and VGG19: 99.37%). DenseNet121 did not discriminate images composed of only left eyes (55.1%, p = 0.548). Class activation mapping identified the macula as the discriminative region used by the CNNs. Several previous studies used the flipping method to augment data in fundus photographs. However, such photographs are distinct from non-flipped images. This asymmetry could result in undesired bias in machine learning. Therefore, when developing a CNN with fundus photographs, care should be taken when applying data augmentation with flipping.
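
The flipping bias the abstract describes is easy to see concretely. Below is a minimal numpy sketch, not the authors' code: the toy image, optic-disc location, and shapes are illustrative assumptions. It shows that horizontally flipping a left-eye fundus image produces right-eye geometry, so flip augmentation that does not also swap the laterality label silently mixes the two classes the CNNs were shown to separate.

    import numpy as np

    def horizontal_flip(image):
        """Mirror an (H, W, C) image along its width axis."""
        return image[:, ::-1, :]

    # Toy "fundus": a bright patch left of center, roughly where the optic
    # disc sits in a typical left-eye photograph (hypothetical coordinates).
    left_eye = np.zeros((64, 64, 3), dtype=np.float32)
    left_eye[28:36, 12:20, :] = 1.0

    flipped = horizontal_flip(left_eye)

    # The disc column mirrors across the image midline: the flipped image
    # now has right-eye geometry while still carrying a left-eye label.
    print(int(np.argmax(left_eye.sum(axis=(0, 2)))),  # 12
          int(np.argmax(flipped.sum(axis=(0, 2)))))   # 44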
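The abstract also relies on class activation mapping (CAM) to localize the macula as the discriminative region. The sketch below implements the standard CAM computation (a classifier-weighted sum of the final convolutional feature maps); the array shapes and random values are stand-ins for a trained network, not the authors' configuration.

    import numpy as np

    def class_activation_map(features, class_weights):
        """CAM heatmap from last-layer conv activations.

        features:      (C, H, W) feature maps from the final conv layer.
        class_weights: (C,) weights of the global-average-pooling classifier
                       for the predicted class (e.g. "right eye").
        Returns an (H, W) map; large values mark regions driving the prediction.
        """
        cam = np.tensordot(class_weights, features, axes=([0], [0]))  # (H, W)
        cam -= cam.min()
        if cam.max() > 0:
            cam /= cam.max()  # normalize to [0, 1] for overlaying on the image
        return cam

    # Toy usage with random stand-ins for a trained network's tensors.
    rng = np.random.default_rng(0)
    heatmap = class_activation_map(rng.standard_normal((512, 7, 7)),
                                   rng.standard_normal(512))
    print(heatmap.shape)  # (7, 7)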

Bibliographic Details
Main Authors: Kang, Tae Seen; Kim, Bum Jun; Nam, Ki Yup; Lee, Seongjin; Kim, Kyonghoon; Lee, Woong-sub; Kim, Jinhyun; Han, Yong Seop
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022 (Sci Rep, published online 2022-01-27)
Collection: PubMed (National Center for Biotechnology Information)
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8795182/
https://www.ncbi.nlm.nih.gov/pubmed/35087071
http://dx.doi.org/10.1038/s41598-021-04323-3
License: © The Author(s) 2022, corrected publication 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.