Learning deep abdominal CT registration through adaptive loss weighting and synthetic data generation

Bibliographic Details
Main Authors: Pérez de Frutos, Javier; Pedersen, André; Pelanis, Egidijus; Bouget, David; Survarachakan, Shanmugapriya; Langø, Thomas; Elle, Ole-Jakob; Lindseth, Frank
Format: Online Article Text
Language: English
Published: Public Library of Science, 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9956065/
https://www.ncbi.nlm.nih.gov/pubmed/36827289
http://dx.doi.org/10.1371/journal.pone.0282110
Description
Summary: PURPOSE: This study explores training strategies to improve convolutional neural network-based image-to-image deformable registration for abdominal imaging. METHODS: Different training strategies, loss functions, and transfer learning schemes were considered. In addition, an augmentation layer that generates artificial training image pairs on the fly was proposed, together with a loss layer that enables dynamic loss weighting. RESULTS: Guiding registration with segmentations during training proved beneficial for deep-learning-based image registration. Fine-tuning the model pretrained on the brain MRI dataset to the abdominal CT dataset further improved performance on the latter application, removing the need for a large dataset to achieve satisfactory performance. Dynamic loss weighting also marginally improved performance, all without impacting inference runtime. CONCLUSION: Using simple concepts, we improved the performance of a commonly used deep image registration architecture, VoxelMorph. In future work, our framework, DDMR, should be validated on different datasets to further assess its value.
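The on-the-fly augmentation layer described in the abstract can be illustrated with a small sketch. The code below is not the authors' implementation; it only demonstrates, under assumed parameters (a displacement bound `max_disp` and smoothing width `smooth_sigma`, both hypothetical names), how a random smooth deformation field can warp a fixed volume into a synthetic moving image, yielding an artificial training pair with a known ground-truth displacement field.

```python
# Minimal sketch (not the authors' code) of on-the-fly synthetic pair
# generation: warp a fixed 3-D image with a random smooth deformation
# field to create an artificial moving image plus its ground-truth field.
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def synthetic_pair(fixed, max_disp=5.0, smooth_sigma=8.0, rng=None):
    """Return (moving, displacement) generated from `fixed` on the fly."""
    rng = np.random.default_rng(rng)
    shape = fixed.shape
    # Smooth random noise into a plausible, bounded displacement field,
    # one component per spatial axis.
    disp = np.stack([
        gaussian_filter(rng.standard_normal(shape), smooth_sigma)
        for _ in range(fixed.ndim)
    ])
    disp *= max_disp / (np.abs(disp).max() + 1e-8)  # bound displacements
    # Resample the fixed image at the displaced coordinates.
    coords = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    warped_coords = [c + d for c, d in zip(coords, disp)]
    moving = map_coordinates(fixed, warped_coords, order=1, mode="nearest")
    return moving, disp

# Example: one synthetic training pair from a toy 32^3 volume.
fixed = np.random.rand(32, 32, 32).astype(np.float32)
moving, disp = synthetic_pair(fixed, rng=0)
```

Because each pair is generated fresh at training time, the network never sees the same deformation twice, which is what removes the need for a large fixed set of registered image pairs.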
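The dynamic loss weighting layer can likewise be sketched. The PyTorch module below assumes an uncertainty-based weighting in the style of Kendall et al. (2018), one common way to learn loss weights during training; the paper's actual loss layer may differ. Each loss term L_i receives a learnable log-variance s_i, and the combined loss is sum_i [exp(-s_i) L_i + s_i], so noisier terms are automatically down-weighted while the additive s_i keeps the weights from collapsing to zero.

```python
# Minimal sketch, assuming uncertainty-style weighting (Kendall et al.,
# 2018) as one plausible form of dynamic loss weighting; the paper's
# exact loss layer may differ.
import torch
import torch.nn as nn

class DynamicLossWeighting(nn.Module):
    """Combine several loss terms with learned log-variance weights."""

    def __init__(self, num_losses):
        super().__init__()
        # One trainable log-variance per loss term, initialised to 0.
        self.log_vars = nn.Parameter(torch.zeros(num_losses))

    def forward(self, losses):
        total = torch.zeros((), dtype=losses[0].dtype)
        for loss, s in zip(losses, self.log_vars):
            # exp(-s) down-weights noisy terms; +s regularises the weights.
            total = total + torch.exp(-s) * loss + s
        return total

# Example: weighting an image-similarity term and a smoothness term.
weighter = DynamicLossWeighting(num_losses=2)
sim_loss = torch.tensor(0.8)
smooth_loss = torch.tensor(0.1)
total = weighter([sim_loss, smooth_loss])
total.backward()  # gradients also flow to the learned log-variances
```

Because the weights are ordinary parameters updated by the same optimizer, this adds negligible cost at training time and nothing at inference, consistent with the abstract's claim that inference runtime is unaffected.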