Annotation-efficient deep learning for automatic medical image segmentation


Bibliographic Details
Main Authors: Wang, Shanshan, Li, Cheng, Wang, Rongpin, Liu, Zaiyi, Wang, Meiyun, Tan, Hongna, Wu, Yaping, Liu, Xinfeng, Sun, Hui, Yang, Rui, Liu, Xin, Chen, Jie, Zhou, Huihui, Ben Ayed, Ismail, Zheng, Hairong
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8501087/
https://www.ncbi.nlm.nih.gov/pubmed/34625565
http://dx.doi.org/10.1038/s41467-021-26216-9
Description
Summary: Automatic medical image segmentation plays a critical role in scientific research and medical care. Existing high-performance deep learning methods typically rely on large training datasets with high-quality manual annotations, which are difficult to obtain in many clinical applications. Here, we introduce Annotation-effIcient Deep lEarning (AIDE), an open-source framework for handling imperfect training datasets. Methodological analyses and empirical evaluations are conducted, and we demonstrate that AIDE surpasses conventional fully-supervised models, achieving better performance on open datasets with scarce or noisy annotations. We further test AIDE in a real-life case study of breast tumor segmentation. Three datasets containing 11,852 breast images from three medical centers are employed, and AIDE, using only 10% of the training annotations, consistently produces segmentation maps comparable to those generated by fully-supervised counterparts or provided by independent radiologists. This 10-fold improvement in the efficiency of using expert labels has the potential to benefit a wide range of biomedical applications.
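The annotation budget described in the summary (training on roughly 10% of the available labels, with the remaining images unlabeled) can be illustrated with a minimal sketch. This is not AIDE's actual pipeline; the function name, the random split, and the use of the case study's 11,852-image count are illustrative assumptions only.

```python
import random

def split_annotation_budget(image_ids, labeled_fraction=0.10, seed=0):
    """Partition a dataset into a small labeled pool and a large unlabeled
    pool, mimicking a 10% annotation budget. Illustrative sketch only --
    the real framework's sampling strategy may differ."""
    rng = random.Random(seed)
    ids = list(image_ids)
    rng.shuffle(ids)
    n_labeled = max(1, round(labeled_fraction * len(ids)))
    return ids[:n_labeled], ids[n_labeled:]

# Hypothetical example using the case study's dataset size (11,852 images):
labeled, unlabeled = split_annotation_budget(range(11852))
print(len(labeled), len(unlabeled))  # 1185 10667
```

Only the small labeled pool would require expert segmentation maps; an annotation-efficient method then exploits the large unlabeled pool during training, which is the source of the 10-fold reduction in labeling effort the abstract reports.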