
Anatomy-aware self-supervised learning for anomaly detection in chest radiographs

Bibliographic Details
Main Authors: Sato, Junya, Suzuki, Yuki, Wataya, Tomohiro, Nishigaki, Daiki, Kita, Kosuke, Yamagata, Kazuki, Tomiyama, Noriyuki, Kido, Shoji
Format: Online Article Text
Language: English
Published: Elsevier 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10331430/
https://www.ncbi.nlm.nih.gov/pubmed/37434699
http://dx.doi.org/10.1016/j.isci.2023.107086
Description
Summary: In this study, we present a self-supervised learning (SSL)-based model that enables anatomical structure-based unsupervised anomaly detection (UAD). The model employs an anatomy-aware pasting (AnatPaste) augmentation tool that uses a threshold-based lung segmentation pretext task to create anomalies in the normal chest radiographs used for model pretraining. These synthetic anomalies resemble real ones and help the model learn to recognize them. We evaluate our model on three open-source chest radiograph datasets. It achieves area under the curve (AUC) values of 92.1%, 78.7%, and 81.9%, the highest among existing UAD models. To the best of our knowledge, this is the first SSL model to employ anatomical information from segmentation as a pretext task. The performance of AnatPaste shows that incorporating anatomical information into SSL can effectively improve accuracy.
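The sketch below illustrates the general idea described in the summary: a crude threshold-based lung mask restricts where a patch is pasted, so the synthetic anomalies used for pretraining fall inside anatomically plausible regions. This is not the authors' released code; the function names, threshold, and blending parameters are assumptions chosen for illustration only, and the paper's actual AnatPaste pipeline may differ.

import numpy as np


def rough_lung_mask(image, threshold=0.4):
    # Crude lung segmentation by intensity thresholding: in a normalized
    # [0, 1] chest radiograph the aerated lung fields are darker than the
    # mediastinum and chest wall. The threshold value is an assumption.
    return image < threshold


def anatpaste_like_augment(image, rng, patch_size=32, alpha=0.7):
    # Create a synthetic anomaly by alpha-blending a patch cut from a random
    # location of the same image into a random position inside the lung mask.
    # Normal images augmented this way can serve as "abnormal" examples
    # during self-supervised pretraining.
    h, w = image.shape
    ys, xs = np.nonzero(rough_lung_mask(image))
    if ys.size == 0:
        return image  # no lung pixels found; leave the image unchanged

    # Random paste centre constrained to the lung region.
    idx = rng.integers(ys.size)
    cy, cx = int(ys[idx]), int(xs[idx])
    half = patch_size // 2
    y0, y1 = max(cy - half, 0), min(cy + half, h)
    x0, x1 = max(cx - half, 0), min(cx + half, w)
    ph, pw = y1 - y0, x1 - x0

    # Source patch taken from a random position anywhere in the image.
    sy = rng.integers(0, h - ph + 1)
    sx = rng.integers(0, w - pw + 1)
    patch = image[sy:sy + ph, sx:sx + pw]

    # Blend the patch into the target region to mimic a subtle lesion.
    out = image.copy()
    out[y0:y1, x0:x1] = alpha * patch + (1.0 - alpha) * out[y0:y1, x0:x1]
    return out


# Example usage on a synthetic normalized "radiograph".
rng = np.random.default_rng(0)
xray = rng.random((256, 256)).astype(np.float32)
augmented = anatpaste_like_augment(xray, rng)

A model pretrained to distinguish original images from images augmented in this way can then be applied to unseen radiographs for anomaly scoring.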