Image Localized Style Transfer to Design Clothes Based on CNN and Interactive Segmentation
| Main Authors: | Wang, Hanying; Xiong, Haitao; Cai, Yuanyuan |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Hindawi, 2020 |
| Subjects: | Research Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7785349/ https://www.ncbi.nlm.nih.gov/pubmed/33456455 http://dx.doi.org/10.1155/2020/8894309 |
Field | Value
---|---
_version_ | 1783632423739719680
author | Wang, Hanying; Xiong, Haitao; Cai, Yuanyuan
author_facet | Wang, Hanying; Xiong, Haitao; Cai, Yuanyuan
author_sort | Wang, Hanying |
collection | PubMed |
description | In recent years, image style transfer has been greatly improved by deep learning. However, when applied directly to clothing style transfer, current methods do not let users control the local transfer position within an image, such as separating a specific T-shirt or pair of trousers from a figure, and they cannot fully preserve the clothing shape. Therefore, this paper proposes an interactive, localized image style transfer method designed specifically for clothes. We introduce an additional image, called the outline image, which is extracted from the content image by an interactive algorithm; the interaction consists simply of dragging a rectangle around the desired clothing. We then introduce an outline loss function based on the distance transform of the outline image, which enables the clothing shape to be preserved. Total variation regularization is employed to smooth and denoise the boundary region. The proposed method constrains the new style to be generated only in the desired clothing region rather than across the whole image, including the background; as a result, the original clothing shape is preserved in the generated images. Experimental results show impressive generated clothing images and demonstrate that this is an effective approach to designing clothes. (See the code sketch after the record fields below.) |
format | Online Article Text |
id | pubmed-7785349 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Hindawi |
record_format | MEDLINE/PubMed |
spelling | pubmed-7785349 2021-01-14 Image Localized Style Transfer to Design Clothes Based on CNN and Interactive Segmentation Wang, Hanying Xiong, Haitao Cai, Yuanyuan Comput Intell Neurosci Research Article In recent years, image style transfer has been greatly improved by deep learning. However, when applied directly to clothing style transfer, current methods do not let users control the local transfer position within an image, such as separating a specific T-shirt or pair of trousers from a figure, and they cannot fully preserve the clothing shape. Therefore, this paper proposes an interactive, localized image style transfer method designed specifically for clothes. We introduce an additional image, called the outline image, which is extracted from the content image by an interactive algorithm; the interaction consists simply of dragging a rectangle around the desired clothing. We then introduce an outline loss function based on the distance transform of the outline image, which enables the clothing shape to be preserved. Total variation regularization is employed to smooth and denoise the boundary region. The proposed method constrains the new style to be generated only in the desired clothing region rather than across the whole image, including the background; as a result, the original clothing shape is preserved in the generated images. Experimental results show impressive generated clothing images and demonstrate that this is an effective approach to designing clothes. Hindawi 2020-12-28 /pmc/articles/PMC7785349/ /pubmed/33456455 http://dx.doi.org/10.1155/2020/8894309 Text en Copyright © 2020 Hanying Wang et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Research Article Wang, Hanying Xiong, Haitao Cai, Yuanyuan Image Localized Style Transfer to Design Clothes Based on CNN and Interactive Segmentation |
title | Image Localized Style Transfer to Design Clothes Based on CNN and Interactive Segmentation |
title_full | Image Localized Style Transfer to Design Clothes Based on CNN and Interactive Segmentation |
title_fullStr | Image Localized Style Transfer to Design Clothes Based on CNN and Interactive Segmentation |
title_full_unstemmed | Image Localized Style Transfer to Design Clothes Based on CNN and Interactive Segmentation |
title_short | Image Localized Style Transfer to Design Clothes Based on CNN and Interactive Segmentation |
title_sort | image localized style transfer to design clothes based on cnn and interactive segmentation |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7785349/ https://www.ncbi.nlm.nih.gov/pubmed/33456455 http://dx.doi.org/10.1155/2020/8894309 |
work_keys_str_mv | AT wanghanying imagelocalizedstyletransfertodesignclothesbasedoncnnandinteractivesegmentation AT xionghaitao imagelocalizedstyletransfertodesignclothesbasedoncnnandinteractivesegmentation AT caiyuanyuan imagelocalizedstyletransfertodesignclothesbasedoncnnandinteractivesegmentation |
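The description above names three concrete ingredients: a rectangle-based interactive segmentation that produces the outline image, an outline loss built on a distance transform of that image, and total variation regularization for the boundary region. The sketch below shows one way those pieces could fit together in Python with OpenCV and PyTorch. It is a minimal reading of the abstract, not the authors' released code: the function names, the GrabCut-based segmentation, the exact weighting of the outline loss, and the loss coefficients are all assumptions.

```python
# Hedged sketch of the pipeline described in the abstract: rectangle-initialised
# segmentation, a distance-transform weight map for an outline loss, and a
# total-variation term. Names and weightings are assumptions, not the paper's code.
import cv2
import numpy as np
import torch


def extract_outline(content_bgr, rect):
    """Segment the garment inside a user-dragged rectangle with GrabCut.

    content_bgr: HxWx3 uint8 image; rect: (x, y, w, h) in pixels.
    Returns a binary uint8 mask (1 = clothing, 0 = background).
    """
    mask = np.zeros(content_bgr.shape[:2], np.uint8)
    bgd_model = np.zeros((1, 65), np.float64)  # internal GrabCut state
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(content_bgr, mask, rect, bgd_model, fgd_model, 5,
                cv2.GC_INIT_WITH_RECT)
    # Keep definite and probable foreground as the clothing region.
    keep = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return np.where(keep, 1, 0).astype(np.uint8)


def outline_weight_map(clothing_mask):
    """Distance-transform weights: zero inside the garment, growing with the
    distance from its boundary outside it (one plausible reading of the
    'outline loss based on distance transform')."""
    outside = (1 - clothing_mask).astype(np.uint8)
    dist = cv2.distanceTransform(outside, cv2.DIST_L2, 5)
    return torch.from_numpy(dist / (dist.max() + 1e-8)).float()


def outline_loss(generated, content, weights):
    """Penalise deviations from the content image outside the clothing region,
    weighted by distance from the garment outline.

    generated, content: 1x3xHxW tensors in [0, 1]; weights: HxW tensor.
    """
    diff = (generated - content).pow(2).sum(dim=1).squeeze(0)  # HxW
    return (weights * diff).mean()


def tv_loss(generated):
    """Total-variation regulariser that smooths and denoises boundary artefacts."""
    dh = (generated[:, :, 1:, :] - generated[:, :, :-1, :]).pow(2).mean()
    dw = (generated[:, :, :, 1:] - generated[:, :, :, :-1]).pow(2).mean()
    return dh + dw


# Usage sketch: combine with the usual CNN content/style losses (e.g. VGG-based)
# and optimise the generated image; the lambda values here are illustrative only.
# total = content_loss + style_loss + 1e2 * outline_loss(g, c, w) + 1e-4 * tv_loss(g)
```

In use, these terms would be added to the usual CNN content and style losses and the generated image optimised by gradient descent. Because the outline weights vanish inside the garment, the style is free to change there, while the background and the garment's shape stay close to the content image, which matches the constraint the abstract describes.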