Identification of the Facial Features of Patients With Cancer: A Deep Learning–Based Pilot Study

Bibliographic Details
Main Authors: Liang, Bin, Yang, Na, He, Guosheng, Huang, Peng, Yang, Yong
Format: Online Article Text
Language: English
Published: JMIR Publications 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7221634/
https://www.ncbi.nlm.nih.gov/pubmed/32347802
http://dx.doi.org/10.2196/17234
author Liang, Bin
Yang, Na
He, Guosheng
Huang, Peng
Yang, Yong
author_facet Liang, Bin
Yang, Na
He, Guosheng
Huang, Peng
Yang, Yong
author_sort Liang, Bin
collection PubMed
description BACKGROUND: Cancer has become the second leading cause of death globally. Most cancer cases are due to genetic mutations, which affect metabolism and can result in facial changes. OBJECTIVE: In this study, we aimed to identify the facial features of patients with cancer using a deep learning technique. METHODS: Images of the faces of patients with cancer were collected to build the cancer face image data set. A face image data set of people without cancer was built by randomly selecting images from the publicly available MegaAge data set according to the sex and age distribution of the cancer face image data set. Each face image was preprocessed to obtain an upright, centered face chip, after which the background was filtered out to exclude the effects of nonrelevant factors. A residual neural network was constructed to classify cancer and noncancer cases. Transfer learning, minibatches, few epochs, L2 regularization, and random dropout were used as training strategies to prevent overfitting. Moreover, guided gradient-weighted class activation mapping (guided Grad-CAM) was used to reveal the relevant features. RESULTS: A total of 8124 face images of patients with cancer (men: n=3851, 47.4%; women: n=4273, 52.6%) were collected from January 2018 to January 2019. The ages of the patients ranged from 1 year to 70 years (median age 52 years). The average faces of both male and female patients with cancer displayed more obvious facial adiposity than the average faces of people without cancer, which was supported by a landmark comparison. Training was terminated after 5 epochs; on the test data set, the area under the receiver operating characteristic curve was 0.94, and the accuracy was 0.82. The main relevant feature of cancer cases was the facial skin, while the relevant features of noncancer cases were extracted from the complementary face region.
CONCLUSIONS: In this study, we built a face data set of patients with cancer and constructed a deep learning model to classify the faces of people with cancer and those without. We found that facial skin and adiposity were closely related to the presence of cancer.
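The reported area under the ROC curve (0.94) and accuracy (0.82) are standard binary-classification metrics. As a minimal, self-contained illustration of how such figures are computed (toy labels and scores, not the study's data), the AUC can be obtained from the rank-based Mann-Whitney formulation: the probability that a randomly chosen positive case scores higher than a randomly chosen negative case, with ties counting half.

```python
# Sketch only: toy data, not the study's. Labels: 1 = cancer, 0 = noncancer.

def roc_auc(labels, scores):
    """AUC = P(random positive scores above random negative); ties count 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

def accuracy(labels, scores, threshold=0.5):
    """Fraction of cases whose thresholded score matches the label."""
    preds = [1 if s >= threshold else 0 for s in scores]
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2]
print(roc_auc(labels, scores))   # 8/9 ≈ 0.889
print(accuracy(labels, scores))  # 4/6 ≈ 0.667
```

Note that AUC is threshold-free while accuracy depends on the chosen cutoff, which is why the two reported numbers (0.94 vs 0.82) differ.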
format Online
Article
Text
id pubmed-7221634
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-72216342020-05-18 Identification of the Facial Features of Patients With Cancer: A Deep Learning–Based Pilot Study Liang, Bin Yang, Na He, Guosheng Huang, Peng Yang, Yong J Med Internet Res Original Paper JMIR Publications 2020-04-29 /pmc/articles/PMC7221634/ /pubmed/32347802 http://dx.doi.org/10.2196/17234 Text en ©Bin Liang, Na Yang, Guosheng He, Peng Huang, Yong Yang. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.04.2020. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
spellingShingle Original Paper
Liang, Bin
Yang, Na
He, Guosheng
Huang, Peng
Yang, Yong
Identification of the Facial Features of Patients With Cancer: A Deep Learning–Based Pilot Study
title Identification of the Facial Features of Patients With Cancer: A Deep Learning–Based Pilot Study
title_full Identification of the Facial Features of Patients With Cancer: A Deep Learning–Based Pilot Study
title_fullStr Identification of the Facial Features of Patients With Cancer: A Deep Learning–Based Pilot Study
title_full_unstemmed Identification of the Facial Features of Patients With Cancer: A Deep Learning–Based Pilot Study
title_short Identification of the Facial Features of Patients With Cancer: A Deep Learning–Based Pilot Study
title_sort identification of the facial features of patients with cancer: a deep learning–based pilot study
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7221634/
https://www.ncbi.nlm.nih.gov/pubmed/32347802
http://dx.doi.org/10.2196/17234
work_keys_str_mv AT liangbin identificationofthefacialfeaturesofpatientswithcanceradeeplearningbasedpilotstudy
AT yangna identificationofthefacialfeaturesofpatientswithcanceradeeplearningbasedpilotstudy
AT heguosheng identificationofthefacialfeaturesofpatientswithcanceradeeplearningbasedpilotstudy
AT huangpeng identificationofthefacialfeaturesofpatientswithcanceradeeplearningbasedpilotstudy
AT yangyong identificationofthefacialfeaturesofpatientswithcanceradeeplearningbasedpilotstudy