Interactive Echocardiography Translation Using Few-Shot GAN Transfer Learning

Bibliographic Details
Main Authors: Teng, Long, Fu, ZhongLiang, Ma, Qian, Yao, Yu, Zhang, Bing, Zhu, Kai, Li, Ping
Format: Online Article Text
Language: English
Published: Hindawi 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7106869/
https://www.ncbi.nlm.nih.gov/pubmed/32256680
http://dx.doi.org/10.1155/2020/1487035
_version_ 1783512705529806848
author Teng, Long
Fu, ZhongLiang
Ma, Qian
Yao, Yu
Zhang, Bing
Zhu, Kai
Li, Ping
author_facet Teng, Long
Fu, ZhongLiang
Ma, Qian
Yao, Yu
Zhang, Bing
Zhu, Kai
Li, Ping
author_sort Teng, Long
collection PubMed
description BACKGROUND: Interactive echocardiography translation is an efficient educational tool for mastering cardiac anatomy. It strengthens the student's understanding through pixel-level translation between echocardiography and theoretical sketch images. Previous studies split the task into two parts, image segmentation and image synthesis, which makes pixel-level corresponding translation hard to achieve. It is also challenging to apply deep-learning-based methods in each phase when only a handful of annotations is available. METHODS: To address interactive translation with limited annotations, we present a two-step transfer learning approach. First, we train two independent parent networks: the ultrasound-to-sketch (U2S) parent network and the sketch-to-ultrasound (S2U) parent network. U2S translation is similar to a segmentation task with sector boundary inference, so the U2S parent network is a U-Net trained on the public VOC2012 segmentation dataset. S2U aims at recovering ultrasound texture, so the S2U parent network is a decoder network that generates ultrasound data from random input. After pretraining the parent networks, an encoder network is attached to the S2U parent network to translate sketch images into ultrasound images. We then jointly transfer-learn U2S and S2U within the CGAN framework. RESULTS AND CONCLUSION: Quantitative and qualitative comparisons across 1-shot, 5-shot, and 10-shot transfer learning show the effectiveness of the proposed algorithm. Interactive translation is achieved with few-shot transfer learning, accelerating the development of new applications from scratch. Our few-shot transfer learning has great potential for biomedical computer-aided image translation, where annotation data are extremely precious.
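The METHODS passage above describes attaching a new encoder network to the pretrained S2U parent decoder so that sketch images can be mapped into the decoder's latent input space and rendered as ultrasound images. A minimal PyTorch sketch of that encoder-plus-pretrained-decoder arrangement follows; the class names, layer sizes, latent dimension, and 64x64 resolution are illustrative assumptions, not the architecture reported in the paper.

```python
# Hypothetical sketch of the S2U parent decoder and the encoder attached to it
# after pretraining, as outlined in the abstract. All shapes and names are
# illustrative assumptions, not the paper's actual network.
import torch
import torch.nn as nn

class Decoder(nn.Module):
    """S2U parent: maps a latent code to a 1-channel 64x64 'ultrasound' image."""
    def __init__(self, latent_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim, 256, 4, 1, 0), nn.ReLU(True),  # 4x4
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.ReLU(True),         # 8x8
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(True),          # 16x16
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.ReLU(True),           # 32x32
            nn.ConvTranspose2d(32, 1, 4, 2, 1), nn.Tanh(),                # 64x64
        )

    def forward(self, z):
        return self.net(z)

class Encoder(nn.Module):
    """Encoder added after pretraining: maps a sketch image to a latent code."""
    def __init__(self, latent_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, 2, 1), nn.LeakyReLU(0.2, True),           # 32x32
            nn.Conv2d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),          # 16x16
            nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2, True),         # 8x8
            nn.Conv2d(128, 256, 4, 2, 1), nn.LeakyReLU(0.2, True),        # 4x4
            nn.Conv2d(256, latent_dim, 4, 1, 0),                          # 1x1
        )

    def forward(self, sketch):
        return self.net(sketch)

class S2UGenerator(nn.Module):
    """Sketch-to-ultrasound generator: new encoder + pretrained parent decoder."""
    def __init__(self, pretrained_decoder, latent_dim=128):
        super().__init__()
        self.encoder = Encoder(latent_dim)
        self.decoder = pretrained_decoder  # weights transferred from the parent

    def forward(self, sketch):
        return self.decoder(self.encoder(sketch))

# The parent-network step would train Decoder on unlabeled ultrasound frames
# (e.g. inside a plain GAN); here we only instantiate it to show the wiring.
decoder = Decoder()
generator = S2UGenerator(decoder)
fake_ultrasound = generator(torch.randn(2, 1, 64, 64))  # two toy "sketches"
print(fake_ultrasound.shape)  # torch.Size([2, 1, 64, 64])
```

In this arrangement only the encoder is newly initialized; the decoder keeps the texture knowledge learned during parent-network pretraining, which is what lets the later fine-tuning stage get by with only a handful of annotated pairs.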
format Online
Article
Text
id pubmed-7106869
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-71068692020-04-02 Interactive Echocardiography Translation Using Few-Shot GAN Transfer Learning Teng, Long Fu, ZhongLiang Ma, Qian Yao, Yu Zhang, Bing Zhu, Kai Li, Ping Comput Math Methods Med Research Article BACKGROUND: Interactive echocardiography translation is an efficient educational tool for mastering cardiac anatomy. It strengthens the student's understanding through pixel-level translation between echocardiography and theoretical sketch images. Previous studies split the task into two parts, image segmentation and image synthesis, which makes pixel-level corresponding translation hard to achieve. It is also challenging to apply deep-learning-based methods in each phase when only a handful of annotations is available. METHODS: To address interactive translation with limited annotations, we present a two-step transfer learning approach. First, we train two independent parent networks: the ultrasound-to-sketch (U2S) parent network and the sketch-to-ultrasound (S2U) parent network. U2S translation is similar to a segmentation task with sector boundary inference, so the U2S parent network is a U-Net trained on the public VOC2012 segmentation dataset. S2U aims at recovering ultrasound texture, so the S2U parent network is a decoder network that generates ultrasound data from random input. After pretraining the parent networks, an encoder network is attached to the S2U parent network to translate sketch images into ultrasound images. We then jointly transfer-learn U2S and S2U within the CGAN framework. RESULTS AND CONCLUSION: Quantitative and qualitative comparisons across 1-shot, 5-shot, and 10-shot transfer learning show the effectiveness of the proposed algorithm. Interactive translation is achieved with few-shot transfer learning, accelerating the development of new applications from scratch. Our few-shot transfer learning has great potential for biomedical computer-aided image translation, where annotation data are extremely precious. Hindawi 2020-03-19 /pmc/articles/PMC7106869/ /pubmed/32256680 http://dx.doi.org/10.1155/2020/1487035 Text en Copyright © 2020 Long Teng et al. http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
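The abstract also describes jointly transfer-learning U2S and S2U within the CGAN framework on 1, 5, or 10 annotated pairs. The sketch below shows one way such a joint few-shot fine-tuning loop could look in PyTorch; the placeholder networks, conditional discriminator design, loss weighting, and optimizer settings are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical joint few-shot CGAN fine-tuning loop: the pretrained U2S and S2U
# generators are tuned together on a handful of paired ultrasound/sketch images,
# each paired with a conditional discriminator that scores (condition, image).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGenerator(nn.Module):
    """Placeholder standing in for a pretrained U2S or S2U generator."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(True),
            nn.Conv2d(16, 1, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class CondDiscriminator(nn.Module):
    """Conditional discriminator: scores the channel-concatenated (cond, img) pair."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(16, 1, 4, 2, 1),  # patch-wise real/fake logits
        )

    def forward(self, cond, img):
        return self.net(torch.cat([cond, img], dim=1))

def finetune_few_shot(u2s, s2u, pairs, steps=200, lr=2e-4, l1_weight=10.0):
    """pairs: list of (ultrasound, sketch) tensors, e.g. 1, 5, or 10 few-shot pairs."""
    d_u2s, d_s2u = CondDiscriminator(), CondDiscriminator()
    g_opt = torch.optim.Adam(list(u2s.parameters()) + list(s2u.parameters()), lr=lr)
    d_opt = torch.optim.Adam(list(d_u2s.parameters()) + list(d_s2u.parameters()), lr=lr)
    bce = F.binary_cross_entropy_with_logits
    for step in range(steps):
        us, sk = pairs[step % len(pairs)]
        us, sk = us.unsqueeze(0), sk.unsqueeze(0)

        # Discriminator update: real (condition, target) pairs vs. generated pairs.
        fake_sk, fake_us = u2s(us).detach(), s2u(sk).detach()
        d_loss = sum(
            bce(d(c, real), torch.ones_like(d(c, real))) +
            bce(d(c, fake), torch.zeros_like(d(c, fake)))
            for d, c, real, fake in [(d_u2s, us, sk, fake_sk), (d_s2u, sk, us, fake_us)]
        )
        d_opt.zero_grad()
        d_loss.backward()
        d_opt.step()

        # Joint generator update: adversarial loss plus L1 against the paired target.
        fake_sk, fake_us = u2s(us), s2u(sk)
        g_loss = (
            bce(d_u2s(us, fake_sk), torch.ones_like(d_u2s(us, fake_sk))) +
            bce(d_s2u(sk, fake_us), torch.ones_like(d_s2u(sk, fake_us))) +
            l1_weight * (F.l1_loss(fake_sk, sk) + F.l1_loss(fake_us, us))
        )
        g_opt.zero_grad()
        g_loss.backward()
        g_opt.step()
    return u2s, s2u

# 5-shot toy example with random tensors standing in for real annotated pairs.
pairs = [(torch.randn(1, 64, 64), torch.randn(1, 64, 64)) for _ in range(5)]
finetune_few_shot(TinyGenerator(), TinyGenerator(), pairs, steps=10)
```

With so few pairs, the adversarial terms supply texture realism while the L1 terms keep each generated image aligned with its paired target; combining the two is the usual rationale in conditional GAN image translation and is one plausible reading of the joint CGAN fine-tuning the abstract describes.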
spellingShingle Research Article
Teng, Long
Fu, ZhongLiang
Ma, Qian
Yao, Yu
Zhang, Bing
Zhu, Kai
Li, Ping
Interactive Echocardiography Translation Using Few-Shot GAN Transfer Learning
title Interactive Echocardiography Translation Using Few-Shot GAN Transfer Learning
title_full Interactive Echocardiography Translation Using Few-Shot GAN Transfer Learning
title_fullStr Interactive Echocardiography Translation Using Few-Shot GAN Transfer Learning
title_full_unstemmed Interactive Echocardiography Translation Using Few-Shot GAN Transfer Learning
title_short Interactive Echocardiography Translation Using Few-Shot GAN Transfer Learning
title_sort interactive echocardiography translation using few-shot gan transfer learning
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7106869/
https://www.ncbi.nlm.nih.gov/pubmed/32256680
http://dx.doi.org/10.1155/2020/1487035
work_keys_str_mv AT tenglong interactiveechocardiographytranslationusingfewshotgantransferlearning
AT fuzhongliang interactiveechocardiographytranslationusingfewshotgantransferlearning
AT maqian interactiveechocardiographytranslationusingfewshotgantransferlearning
AT yaoyu interactiveechocardiographytranslationusingfewshotgantransferlearning
AT zhangbing interactiveechocardiographytranslationusingfewshotgantransferlearning
AT zhukai interactiveechocardiographytranslationusingfewshotgantransferlearning
AT liping interactiveechocardiographytranslationusingfewshotgantransferlearning