Deep learning for fast low-field MRI acquisitions
Low-field (LF) MRI research is currently gaining momentum from its potential to offer reduced costs and smaller footprints, translating into wider accessibility. However, the lower signal-to-noise ratio inherent to weaker magnetic fields can have a significant impact on acquisition times, which challenges the clinical relevance of LF MRI. Undersampling is an effective way to speed up acquisitions in MRI, and recent work has shown encouraging results when it is combined with deep learning (DL). (An illustrative undersampling sketch follows the record summary below.)
| Main Authors: | Ayde, Reina; Senft, Tobias; Salameh, Najat; Sarracanie, Mathieu |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Nature Publishing Group UK, 2022 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9259619/ https://www.ncbi.nlm.nih.gov/pubmed/35794175 http://dx.doi.org/10.1038/s41598-022-14039-7 |
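The record below describes a retrospective study in which LF k-space data were undersampled at different acceleration rates and sampling patterns before deep-learning reconstruction. As a purely illustrative aid, and not material from the article, here is a minimal NumPy sketch of one common retrospective scheme, a 1D-Cartesian mask with a fully sampled centre; the mask design, centre fraction, and acceleration factor are assumptions chosen only for demonstration.

```python
import numpy as np

def cartesian_undersampling_mask(ny, nx, acceleration=5, center_fraction=0.08, rng=None):
    """Hypothetical 1D-Cartesian mask: keep a fully sampled block of central
    phase-encode lines plus randomly chosen outer lines, so that roughly
    1/acceleration of all lines are retained."""
    rng = np.random.default_rng() if rng is None else rng
    mask = np.zeros((ny, nx), dtype=bool)

    # Fully sampled low-frequency centre.
    n_center = max(1, int(round(center_fraction * ny)))
    c0 = (ny - n_center) // 2
    mask[c0:c0 + n_center, :] = True

    # Randomly keep outer phase-encode lines up to the target sampling rate.
    n_keep_total = int(round(ny / acceleration))
    outer = [i for i in range(ny) if not mask[i, 0]]
    n_outer = max(0, n_keep_total - n_center)
    for i in rng.choice(outer, size=min(n_outer, len(outer)), replace=False):
        mask[i, :] = True
    return mask

# Apply the mask to a stand-in complex k-space and form a zero-filled image.
kspace = np.random.randn(128, 128) + 1j * np.random.randn(128, 128)
mask = cartesian_undersampling_mask(*kspace.shape, acceleration=5)
zero_filled = np.fft.ifft2(np.fft.ifftshift(kspace * mask))
print(mask.mean())  # effective sampling rate, ~0.2 for fivefold undersampling
```

Zero-filled reconstructions of such masked k-space data are the kind of aliased inputs a learned reconstruction network would then be trained to clean up.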
_version_ | 1784741825748140032 |
---|---|
author | Ayde, Reina; Senft, Tobias; Salameh, Najat; Sarracanie, Mathieu
author_facet | Ayde, Reina; Senft, Tobias; Salameh, Najat; Sarracanie, Mathieu
author_sort | Ayde, Reina |
collection | PubMed |
description | Low-field (LF) MRI research is currently gaining momentum from its potential to offer reduced costs and smaller footprints, translating into wider accessibility. However, the lower signal-to-noise ratio inherent to weaker magnetic fields can have a significant impact on acquisition times, which challenges the clinical relevance of LF MRI. Undersampling is an effective way to speed up acquisitions in MRI, and recent work has shown encouraging results when it is combined with deep learning (DL). Yet, training DL models generally requires large databases that are not yet available at LF regimes. Here, we demonstrate the capability of a residual U-Net combined with data augmentation to reconstruct magnitude and phase information from undersampled LF MRI scans at 0.1 T with a limited training dataset (n = 10). The model performance was first evaluated in a retrospective study for different acceleration rates and sampling patterns. Ultimately, the DL approach was validated on prospectively acquired, fivefold undersampled LF data. With performance varying with the adopted sampling scheme, our results show that the investigated approach can preserve the global structure and the sharpness of details in the reconstructed magnitude and phase images. Overall, promising results could be obtained on acquired LF MR images, which may bring this research closer to clinical implementation. (An illustrative sketch of such a reconstruction network follows this record.) |
format | Online Article Text |
id | pubmed-9259619 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-9259619 2022-07-08 Deep learning for fast low-field MRI acquisitions Ayde, Reina Senft, Tobias Salameh, Najat Sarracanie, Mathieu Sci Rep Article Low-field (LF) MRI research is currently gaining momentum from its potential to offer reduced costs and smaller footprints, translating into wider accessibility. However, the lower signal-to-noise ratio inherent to weaker magnetic fields can have a significant impact on acquisition times, which challenges the clinical relevance of LF MRI. Undersampling is an effective way to speed up acquisitions in MRI, and recent work has shown encouraging results when it is combined with deep learning (DL). Yet, training DL models generally requires large databases that are not yet available at LF regimes. Here, we demonstrate the capability of a residual U-Net combined with data augmentation to reconstruct magnitude and phase information from undersampled LF MRI scans at 0.1 T with a limited training dataset (n = 10). The model performance was first evaluated in a retrospective study for different acceleration rates and sampling patterns. Ultimately, the DL approach was validated on prospectively acquired, fivefold undersampled LF data. With performance varying with the adopted sampling scheme, our results show that the investigated approach can preserve the global structure and the sharpness of details in the reconstructed magnitude and phase images. Overall, promising results could be obtained on acquired LF MR images, which may bring this research closer to clinical implementation. Nature Publishing Group UK 2022-07-06 /pmc/articles/PMC9259619/ /pubmed/35794175 http://dx.doi.org/10.1038/s41598-022-14039-7 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . |
spellingShingle | Article Ayde, Reina Senft, Tobias Salameh, Najat Sarracanie, Mathieu Deep learning for fast low-field MRI acquisitions |
title | Deep learning for fast low-field MRI acquisitions |
title_full | Deep learning for fast low-field MRI acquisitions |
title_fullStr | Deep learning for fast low-field MRI acquisitions |
title_full_unstemmed | Deep learning for fast low-field MRI acquisitions |
title_short | Deep learning for fast low-field MRI acquisitions |
title_sort | deep learning for fast low-field mri acquisitions |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9259619/ https://www.ncbi.nlm.nih.gov/pubmed/35794175 http://dx.doi.org/10.1038/s41598-022-14039-7 |
work_keys_str_mv | AT aydereina deeplearningforfastlowfieldmriacquisitions AT senfttobias deeplearningforfastlowfieldmriacquisitions AT salamehnajat deeplearningforfastlowfieldmriacquisitions AT sarracaniemathieu deeplearningforfastlowfieldmriacquisitions |
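For readers unfamiliar with the network family named in the description, the following is a minimal, generic sketch of a two-scale residual U-Net in PyTorch. It is not the authors' architecture: the depth, channel counts, normalisation, and the two-channel input/output convention (e.g., real/imaginary or magnitude/phase) are assumptions made only to illustrate the idea of combining U-Net skip connections with residual learning.

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip connection (generic residual block)."""
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.body(x))

class TinyResUNet(nn.Module):
    """Minimal two-scale U-Net built from residual blocks. Two input/output
    channels so a complex image (or magnitude and phase) can be mapped jointly."""
    def __init__(self, in_ch=2, base=32):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, base, 3, padding=1), ResBlock(base))
        self.down = nn.Conv2d(base, base * 2, 3, stride=2, padding=1)   # downsample
        self.enc2 = ResBlock(base * 2)
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)       # upsample
        self.dec1 = ResBlock(base)
        self.out = nn.Conv2d(base, in_ch, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.down(e1))
        d1 = self.dec1(self.up(e2) + e1)   # skip connection between scales
        return self.out(d1) + x            # global residual: learn the artefact correction

# Usage: map a zero-filled, two-channel image to an artefact-reduced estimate.
net = TinyResUNet()
zero_filled = torch.randn(1, 2, 128, 128)
recon = net(zero_filled)
print(recon.shape)  # torch.Size([1, 2, 128, 128])
```

In practice such a network would be trained on pairs of undersampled and fully sampled images, with data augmentation helping to compensate for the small training set (n = 10) mentioned in the abstract.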