Change detection based on unsupervised sparse representation for fundus image pair
Detecting changes is an important task in ophthalmology: longitudinal fundus images taken at different stages are compared to obtain change regions. Illumination variations introduce distractions into the change regions under pixel-by-pixel comparison. In this paper, a new unsupervised change detection method...
Main Authors: | Fu, Yinghua, Zhao, Xing, Liang, Yong, Zhao, Tiejun, Wang, Chaoli, Zhang, Dawei |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9197950/ https://www.ncbi.nlm.nih.gov/pubmed/35701500 http://dx.doi.org/10.1038/s41598-022-13754-5 |
_version_ | 1784727524253630464 |
---|---|
author | Fu, Yinghua Zhao, Xing Liang, Yong Zhao, Tiejun Wang, Chaoli Zhang, Dawei |
author_facet | Fu, Yinghua Zhao, Xing Liang, Yong Zhao, Tiejun Wang, Chaoli Zhang, Dawei |
author_sort | Fu, Yinghua |
collection | PubMed |
description | Detecting changes is an important task in ophthalmology: longitudinal fundus images taken at different stages are compared to obtain change regions. Illumination variations introduce distractions into the change regions under pixel-by-pixel comparison. In this paper, a new unsupervised change detection method based on sparse representation classification (SRC) is proposed for fundus image pairs. First, local neighborhood patches are extracted from the reference image to build a dictionary of the local background. Then each patch of the current image is represented sparsely, and its background is reconstructed from the obtained dictionary. Finally, change regions are obtained through background subtraction. The SRC method automatically corrects illumination variations through the representation coefficients and effectively filters out local contrast and global intensity differences. In the experiments of this paper, the AUC and mAP values of the SRC method are 0.9858 and 0.8647 respectively for an image pair with small lesions; the AUC and mAP values of the fusion of IRHSF and SRC are 0.9892 and 0.9692 respectively for an image pair with a large change region. Experiments show that the proposed method is more robust to illumination variations than RPCA and detects change regions more effectively than pixel-wise image differencing. |
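The abstract's pipeline (reference patches form a local-background dictionary; each current patch is sparsely represented; the reconstructed background is subtracted) can be sketched with plain NumPy. This is a minimal illustration, not the authors' implementation: the patch size, sparsity level, greedy OMP solver, and the function names `omp` and `change_map` are all our assumptions.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: approximate y as a k-sparse
    combination of the columns (atoms) of dictionary D."""
    residual = y.astype(float).copy()
    support = []
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        idx = int(np.argmax(np.abs(D.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Re-fit coefficients on the chosen atoms by least squares.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

def change_map(reference, current, patch=5, k=3):
    """For each patch of the current image, reconstruct its background
    from co-located reference patches and subtract it."""
    h, w = reference.shape
    out = np.zeros(current.shape, dtype=float)
    r = patch // 2
    for i in range(r, h - r, patch):
        for j in range(r, w - r, patch):
            # Dictionary: local neighborhood patches from the reference image.
            atoms = []
            for di in (-patch, 0, patch):
                for dj in (-patch, 0, patch):
                    ii, jj = i + di, j + dj
                    if r <= ii < h - r and r <= jj < w - r:
                        atoms.append(reference[ii-r:ii+r+1, jj-r:jj+r+1].ravel())
            D = np.stack(atoms, axis=1)
            # Normalize atoms so their correlations are comparable.
            D = D / (np.linalg.norm(D, axis=0, keepdims=True) + 1e-12)
            y = current[i-r:i+r+1, j-r:j+r+1].ravel()
            background = D @ omp(D, y, k)
            # Change = residual left after background subtraction.
            out[i-r:i+r+1, j-r:j+r+1] = np.abs(y - background).reshape(patch, patch)
    return out
```

A global illumination change simply scales the representation coefficients, so the background is still reconstructed and its residual stays near zero; a lesion, which the background atoms cannot represent, leaves a large residual.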
format | Online Article Text |
id | pubmed-9197950 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-9197950 2022-06-16 Change detection based on unsupervised sparse representation for fundus image pair Fu, Yinghua Zhao, Xing Liang, Yong Zhao, Tiejun Wang, Chaoli Zhang, Dawei Sci Rep Article Detecting changes is an important task in ophthalmology: longitudinal fundus images taken at different stages are compared to obtain change regions. Illumination variations introduce distractions into the change regions under pixel-by-pixel comparison. In this paper, a new unsupervised change detection method based on sparse representation classification (SRC) is proposed for fundus image pairs. First, local neighborhood patches are extracted from the reference image to build a dictionary of the local background. Then each patch of the current image is represented sparsely, and its background is reconstructed from the obtained dictionary. Finally, change regions are obtained through background subtraction. The SRC method automatically corrects illumination variations through the representation coefficients and effectively filters out local contrast and global intensity differences. In the experiments of this paper, the AUC and mAP values of the SRC method are 0.9858 and 0.8647 respectively for an image pair with small lesions; the AUC and mAP values of the fusion of IRHSF and SRC are 0.9892 and 0.9692 respectively for an image pair with a large change region. Experiments show that the proposed method is more robust to illumination variations than RPCA and detects change regions more effectively than pixel-wise image differencing.
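The abstract evaluates detection quality with AUC and mAP. Given a pixel-wise change-score map and a ground-truth change mask, both metrics can be computed from ranks alone; a minimal NumPy sketch follows (the function names `auc_score` and `average_precision` are ours, and ties in scores are not handled):

```python
import numpy as np

def auc_score(labels, scores):
    """ROC AUC via the Mann-Whitney rank statistic: the fraction of
    (positive, negative) pairs where the positive scores higher."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)   # ascending ranks
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def average_precision(labels, scores):
    """AP: mean of the precision values taken at each true positive
    when pixels are ranked by descending score."""
    order = np.argsort(-scores)
    labels = labels[order]
    hits = np.cumsum(labels)
    prec = hits / np.arange(1, len(labels) + 1)
    return prec[labels == 1].mean()
```

For a change map, `scores` is the flattened residual image and `labels` the flattened ground-truth mask; mAP is then the mean of `average_precision` over the evaluated image pairs.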
Nature Publishing Group UK 2022-06-14 /pmc/articles/PMC9197950/ /pubmed/35701500 http://dx.doi.org/10.1038/s41598-022-13754-5 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Fu, Yinghua Zhao, Xing Liang, Yong Zhao, Tiejun Wang, Chaoli Zhang, Dawei Change detection based on unsupervised sparse representation for fundus image pair |
title | Change detection based on unsupervised sparse representation for fundus image pair |
title_full | Change detection based on unsupervised sparse representation for fundus image pair |
title_fullStr | Change detection based on unsupervised sparse representation for fundus image pair |
title_full_unstemmed | Change detection based on unsupervised sparse representation for fundus image pair |
title_short | Change detection based on unsupervised sparse representation for fundus image pair |
title_sort | change detection based on unsupervised sparse representation for fundus image pair |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9197950/ https://www.ncbi.nlm.nih.gov/pubmed/35701500 http://dx.doi.org/10.1038/s41598-022-13754-5 |
work_keys_str_mv | AT fuyinghua changedetectionbasedonunsupervisedsparserepresentationforfundusimagepair AT zhaoxing changedetectionbasedonunsupervisedsparserepresentationforfundusimagepair AT liangyong changedetectionbasedonunsupervisedsparserepresentationforfundusimagepair AT zhaotiejun changedetectionbasedonunsupervisedsparserepresentationforfundusimagepair AT wangchaoli changedetectionbasedonunsupervisedsparserepresentationforfundusimagepair AT zhangdawei changedetectionbasedonunsupervisedsparserepresentationforfundusimagepair |