
Dual-ATME: Dual-Branch Attention Network for Micro-Expression Recognition

Micro-expression recognition (MER) is challenging due to the difficulty of capturing the instantaneous and subtle motion changes of micro-expressions (MEs). Early works based on hand-crafted features extracted from prior knowledge showed some promising results, but have recently been replaced by deep learning methods based on the attention mechanism. However, with limited ME sample sizes, features extracted by these methods lack discriminative ME representations, resulting in yet-to-be-improved MER performance. This paper proposes the Dual-branch Attention Network (Dual-ATME) for MER to address the problem of ineffective single-scale features representing MEs. Specifically, Dual-ATME consists of two components: Hand-crafted Attention Region Selection (HARS) and Automated Attention Region Selection (AARS). HARS uses prior knowledge to manually extract features from regions of interest (ROIs). Meanwhile, AARS is based on attention mechanisms and extracts hidden information from data automatically. Finally, through similarity comparison and feature fusion, the dual-scale features can be used to learn ME representations effectively. Experiments on spontaneous ME datasets (including CASME II, SAMM, and SMIC) and their composite dataset, MEGC2019-CD, showed that Dual-ATME achieves better, or more competitive, performance than state-of-the-art MER methods.
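The dual-branch idea in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the ROI coordinates, the (untrained) attention map, and the mean-pooling used for feature extraction are all placeholder assumptions, kept only to show the overall flow of HARS and AARS features through similarity comparison and fusion.

```python
# Illustrative sketch (NOT the authors' code) of the Dual-ATME idea:
# one branch pools features from hand-selected ROIs (HARS), the other
# from an automatically learned attention re-weighting (AARS); the two
# feature vectors are compared by cosine similarity and fused by
# concatenation. All shapes, ROIs, and weights below are made up.
import numpy as np

rng = np.random.default_rng(0)
frame = rng.random((64, 64))  # stands in for one pre-processed ME frame

# HARS branch: mean-pool fixed regions of interest (e.g. eye and mouth
# areas) chosen from prior knowledge; these coordinates are arbitrary.
rois = [(8, 24, 8, 56), (40, 56, 16, 48)]
hars_feat = np.array([frame[r0:r1, c0:c1].mean() for r0, r1, c0, c1 in rois])

# AARS branch: a (random, untrained) attention map re-weights the frame
# before the same pooling is applied, mimicking automatic selection.
attention = rng.random((64, 64))
attention /= attention.sum()
attended = frame * attention * frame.size  # mean attention weight == 1
aars_feat = np.array([attended[r0:r1, c0:c1].mean() for r0, r1, c0, c1 in rois])

# Similarity comparison between the two branches' features ...
cos_sim = hars_feat @ aars_feat / (
    np.linalg.norm(hars_feat) * np.linalg.norm(aars_feat))

# ... and feature fusion by concatenation for a downstream classifier.
fused = np.concatenate([hars_feat, aars_feat])
```

In the paper the branches are deep networks and the attention map is learned; here both reduce to pooling so the two-branch/compare/fuse structure stays visible in a few lines.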


Bibliographic Details
Main Authors: Zhou, Haoliang, Huang, Shucheng, Li, Jingting, Wang, Su-Jing
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10048169/
https://www.ncbi.nlm.nih.gov/pubmed/36981348
http://dx.doi.org/10.3390/e25030460
_version_ 1785014113071529984
author Zhou, Haoliang
Huang, Shucheng
Li, Jingting
Wang, Su-Jing
author_facet Zhou, Haoliang
Huang, Shucheng
Li, Jingting
Wang, Su-Jing
author_sort Zhou, Haoliang
collection PubMed
description Micro-expression recognition (MER) is challenging due to the difficulty of capturing the instantaneous and subtle motion changes of micro-expressions (MEs). Early works based on hand-crafted features extracted from prior knowledge showed some promising results, but have recently been replaced by deep learning methods based on the attention mechanism. However, with limited ME sample sizes, features extracted by these methods lack discriminative ME representations, resulting in yet-to-be-improved MER performance. This paper proposes the Dual-branch Attention Network (Dual-ATME) for MER to address the problem of ineffective single-scale features representing MEs. Specifically, Dual-ATME consists of two components: Hand-crafted Attention Region Selection (HARS) and Automated Attention Region Selection (AARS). HARS uses prior knowledge to manually extract features from regions of interest (ROIs). Meanwhile, AARS is based on attention mechanisms and extracts hidden information from data automatically. Finally, through similarity comparison and feature fusion, the dual-scale features can be used to learn ME representations effectively. Experiments on spontaneous ME datasets (including CASME II, SAMM, and SMIC) and their composite dataset, MEGC2019-CD, showed that Dual-ATME achieves better, or more competitive, performance than state-of-the-art MER methods.
format Online
Article
Text
id pubmed-10048169
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-100481692023-03-29 Dual-ATME: Dual-Branch Attention Network for Micro-Expression Recognition Zhou, Haoliang Huang, Shucheng Li, Jingting Wang, Su-Jing Entropy (Basel) Article Micro-expression recognition (MER) is challenging due to the difficulty of capturing the instantaneous and subtle motion changes of micro-expressions (MEs). Early works based on hand-crafted features extracted from prior knowledge showed some promising results, but have recently been replaced by deep learning methods based on the attention mechanism. However, with limited ME sample sizes, features extracted by these methods lack discriminative ME representations, resulting in yet-to-be-improved MER performance. This paper proposes the Dual-branch Attention Network (Dual-ATME) for MER to address the problem of ineffective single-scale features representing MEs. Specifically, Dual-ATME consists of two components: Hand-crafted Attention Region Selection (HARS) and Automated Attention Region Selection (AARS). HARS uses prior knowledge to manually extract features from regions of interest (ROIs). Meanwhile, AARS is based on attention mechanisms and extracts hidden information from data automatically. Finally, through similarity comparison and feature fusion, the dual-scale features can be used to learn ME representations effectively. Experiments on spontaneous ME datasets (including CASME II, SAMM, and SMIC) and their composite dataset, MEGC2019-CD, showed that Dual-ATME achieves better, or more competitive, performance than state-of-the-art MER methods. MDPI 2023-03-06 /pmc/articles/PMC10048169/ /pubmed/36981348 http://dx.doi.org/10.3390/e25030460 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Zhou, Haoliang
Huang, Shucheng
Li, Jingting
Wang, Su-Jing
Dual-ATME: Dual-Branch Attention Network for Micro-Expression Recognition
title Dual-ATME: Dual-Branch Attention Network for Micro-Expression Recognition
title_full Dual-ATME: Dual-Branch Attention Network for Micro-Expression Recognition
title_fullStr Dual-ATME: Dual-Branch Attention Network for Micro-Expression Recognition
title_full_unstemmed Dual-ATME: Dual-Branch Attention Network for Micro-Expression Recognition
title_short Dual-ATME: Dual-Branch Attention Network for Micro-Expression Recognition
title_sort dual-atme: dual-branch attention network for micro-expression recognition
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10048169/
https://www.ncbi.nlm.nih.gov/pubmed/36981348
http://dx.doi.org/10.3390/e25030460
work_keys_str_mv AT zhouhaoliang dualatmedualbranchattentionnetworkformicroexpressionrecognition
AT huangshucheng dualatmedualbranchattentionnetworkformicroexpressionrecognition
AT lijingting dualatmedualbranchattentionnetworkformicroexpressionrecognition
AT wangsujing dualatmedualbranchattentionnetworkformicroexpressionrecognition