A Foreground-Aware Framework for Local Face Attribute Transfer
Main Authors: | Fu, Yuanbin; Ma, Jiayi; Guo, Xiaojie
---|---
Format: | Online Article Text
Language: | English
Published: | MDPI, 2021
Subjects: | Article
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8156749/ https://www.ncbi.nlm.nih.gov/pubmed/34065640 http://dx.doi.org/10.3390/e23050615
_version_ | 1783699520066945024 |
---|---|
author | Fu, Yuanbin; Ma, Jiayi; Guo, Xiaojie |
author_facet | Fu, Yuanbin; Ma, Jiayi; Guo, Xiaojie |
author_sort | Fu, Yuanbin |
collection | PubMed |
description | In the context of social media, large numbers of headshot photos are taken every day. Unfortunately, in addition to laborious editing and modification, creating a visually compelling photographic masterpiece for sharing requires advanced professional skills, which are difficult for ordinary Internet users. Although many algorithms automatically and globally transfer the style from one image to another, they fail to respect the semantics of the scene and do not allow users to transfer only the attributes of one or two face organs in the foreground region while leaving the background region unchanged. To overcome this problem, we developed a novel framework for semantically meaningful local face attribute transfer, which can flexibly transfer the local attribute of a face organ from the reference image to a semantically equivalent organ in the input image while preserving the background. Our method warps the reference photo to match the shape, pose, location, and expression of the input image. The fusion of the warped reference image and the input image is then taken as the initialization for a neural style transfer algorithm. Our method achieves better performance in terms of inception score (3.81) and Fréchet inception distance (80.31), about 10% better than those of competitors, indicating that our framework is capable of producing high-quality and photorealistic attribute transfer results. Both theoretical findings and experimental results are provided to demonstrate the efficacy of the proposed framework and reveal its superiority over other state-of-the-art alternatives. |
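The abstract reports two standard generative-image metrics: inception score (higher is better) and Fréchet inception distance (lower is better). As a minimal sketch of how these metrics are defined — assuming per-image class probabilities and pooled features have already been extracted with a pretrained classifier (Inception v3 in the usual formulation; the feature-extraction step is omitted here) — they can be computed as:

```python
import numpy as np

def inception_score(probs, eps=1e-12):
    """IS = exp(mean_x KL(p(y|x) || p(y))).
    probs: (N, C) per-image class probabilities from a classifier."""
    p_y = probs.mean(axis=0, keepdims=True)                # marginal class distribution
    kl = probs * (np.log(probs + eps) - np.log(p_y + eps))
    return float(np.exp(kl.sum(axis=1).mean()))

def _sqrtm_psd(m):
    """Matrix square root of a symmetric positive semidefinite matrix."""
    vals, vecs = np.linalg.eigh(m)
    vals = np.clip(vals, 0.0, None)                        # guard tiny negative eigenvalues
    return (vecs * np.sqrt(vals)) @ vecs.T

def frechet_inception_distance(feats_a, feats_b):
    """FID = ||mu_a - mu_b||^2 + Tr(S_a + S_b - 2 (S_a S_b)^{1/2}),
    treating each feature set (N, D) as samples from a Gaussian."""
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    s_a = np.cov(feats_a, rowvar=False)
    s_b = np.cov(feats_b, rowvar=False)
    # Tr((S_a S_b)^{1/2}) via the symmetric form (S_b^{1/2} S_a S_b^{1/2})^{1/2}
    sb_half = _sqrtm_psd(s_b)
    tr_covmean = np.trace(_sqrtm_psd(sb_half @ s_a @ sb_half))
    diff = mu_a - mu_b
    return float(diff @ diff + np.trace(s_a + s_b) - 2.0 * tr_covmean)
```

The symmetric rewrite of the trace term avoids taking the square root of the non-symmetric product `S_a @ S_b` directly, which keeps the computation in real arithmetic for positive semidefinite covariances.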
format | Online Article Text |
id | pubmed-8156749 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8156749 2021-05-28 A Foreground-Aware Framework for Local Face Attribute Transfer Fu, Yuanbin; Ma, Jiayi; Guo, Xiaojie. Entropy (Basel), Article. MDPI 2021-05-16 /pmc/articles/PMC8156749/ /pubmed/34065640 http://dx.doi.org/10.3390/e23050615 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Fu, Yuanbin Ma, Jiayi Guo, Xiaojie A Foreground-Aware Framework for Local Face Attribute Transfer |
title | A Foreground-Aware Framework for Local Face Attribute Transfer |
title_full | A Foreground-Aware Framework for Local Face Attribute Transfer |
title_fullStr | A Foreground-Aware Framework for Local Face Attribute Transfer |
title_full_unstemmed | A Foreground-Aware Framework for Local Face Attribute Transfer |
title_short | A Foreground-Aware Framework for Local Face Attribute Transfer |
title_sort | foreground-aware framework for local face attribute transfer |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8156749/ https://www.ncbi.nlm.nih.gov/pubmed/34065640 http://dx.doi.org/10.3390/e23050615 |
work_keys_str_mv | AT fuyuanbin aforegroundawareframeworkforlocalfaceattributetransfer AT majiayi aforegroundawareframeworkforlocalfaceattributetransfer AT guoxiaojie aforegroundawareframeworkforlocalfaceattributetransfer AT fuyuanbin foregroundawareframeworkforlocalfaceattributetransfer AT majiayi foregroundawareframeworkforlocalfaceattributetransfer AT guoxiaojie foregroundawareframeworkforlocalfaceattributetransfer |