An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network
| Main Authors: | Park, Changhyun; Lee, Hean Sung; Kim, Woo Jin; Bae, Han Byeol; Lee, Jaeho; Lee, Sangyoun |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | MDPI 2021 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8623800/ https://www.ncbi.nlm.nih.gov/pubmed/34833717 http://dx.doi.org/10.3390/s21227640 |
| _version_ | 1784606019311108096 |
|---|---|
| author | Park, Changhyun; Lee, Hean Sung; Kim, Woo Jin; Bae, Han Byeol; Lee, Jaeho; Lee, Sangyoun |
| author_facet | Park, Changhyun; Lee, Hean Sung; Kim, Woo Jin; Bae, Han Byeol; Lee, Jaeho; Lee, Sangyoun |
| author_sort | Park, Changhyun |
| collection | PubMed |
| description | Multi-person pose estimation has been gaining considerable interest due to its use in several real-world applications, such as activity recognition, motion capture, and augmented reality. Although improving the accuracy and speed of multi-person pose estimation techniques has been studied recently, limitations still exist in balancing these two aspects. In this paper, a novel knowledge-distilled lightweight top-down pose network (KDLPN) is proposed that balances computational complexity and accuracy. For the first time in multi-person pose estimation, a network is presented that reduces computational complexity by applying a "Pelee" structure and by shuffling pixels in the dense upsampling convolution layer to reduce the number of channels. Furthermore, to prevent performance degradation caused by the reduced computational complexity, knowledge distillation is applied, using a pose estimation network as the teacher network. The method's performance is evaluated on the MSCOCO dataset. Experimental results demonstrate that our KDLPN reduces the number of parameters by 95% compared with state-of-the-art methods, with minimal performance degradation. Moreover, our method is compared with other pose estimation methods to substantiate the importance and effectiveness of reducing computational complexity. (An illustrative sketch of the pixel-shuffle upsampling and the distillation loss follows this record.) |
| format | Online Article Text |
| id | pubmed-8623800 |
| institution | National Center for Biotechnology Information |
| language | English |
| publishDate | 2021 |
| publisher | MDPI |
| record_format | MEDLINE/PubMed |
| spelling | pubmed-8623800 2021-11-27 An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network Park, Changhyun; Lee, Hean Sung; Kim, Woo Jin; Bae, Han Byeol; Lee, Jaeho; Lee, Sangyoun Sensors (Basel) Article Multi-person pose estimation has been gaining considerable interest due to its use in several real-world applications, such as activity recognition, motion capture, and augmented reality. Although improving the accuracy and speed of multi-person pose estimation techniques has been studied recently, limitations still exist in balancing these two aspects. In this paper, a novel knowledge-distilled lightweight top-down pose network (KDLPN) is proposed that balances computational complexity and accuracy. For the first time in multi-person pose estimation, a network is presented that reduces computational complexity by applying a "Pelee" structure and by shuffling pixels in the dense upsampling convolution layer to reduce the number of channels. Furthermore, to prevent performance degradation caused by the reduced computational complexity, knowledge distillation is applied, using a pose estimation network as the teacher network. The method's performance is evaluated on the MSCOCO dataset. Experimental results demonstrate that our KDLPN reduces the number of parameters by 95% compared with state-of-the-art methods, with minimal performance degradation. Moreover, our method is compared with other pose estimation methods to substantiate the importance and effectiveness of reducing computational complexity. MDPI 2021-11-17 /pmc/articles/PMC8623800/ /pubmed/34833717 http://dx.doi.org/10.3390/s21227640 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
| spellingShingle | Article; Park, Changhyun; Lee, Hean Sung; Kim, Woo Jin; Bae, Han Byeol; Lee, Jaeho; Lee, Sangyoun; An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network |
| title | An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network |
| title_full | An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network |
| title_fullStr | An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network |
| title_full_unstemmed | An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network |
| title_short | An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network |
| title_sort | efficient approach using knowledge distillation methods to stabilize performance in a lightweight top-down posture estimation network |
| topic | Article |
| url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8623800/ https://www.ncbi.nlm.nih.gov/pubmed/34833717 http://dx.doi.org/10.3390/s21227640 |
| work_keys_str_mv | AT parkchanghyun anefficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT leeheansung anefficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT kimwoojin anefficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT baehanbyeol anefficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT leejaeho anefficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT leesangyoun anefficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT parkchanghyun efficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT leeheansung efficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT kimwoojin efficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT baehanbyeol efficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT leejaeho efficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork AT leesangyoun efficientapproachusingknowledgedistillationmethodstostabilizeperformanceinalightweighttopdownpostureestimationnetwork |
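The description field above names two concrete mechanisms: a pixel-shuffle (sub-pixel convolution) step in the dense upsampling layer that cuts the channel count, and knowledge distillation from a teacher pose network. The PyTorch sketch below is a minimal illustration of both ideas, not the authors' actual KDLPN: the module names, the 17-keypoint (MSCOCO) heatmap count, the upscale factor, and the MSE-based blend weight `alpha` are all illustrative assumptions rather than details taken from the paper or this record.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PixelShuffleHead(nn.Module):
    """Upsampling head using pixel shuffle (sub-pixel convolution).

    A 1x1 conv produces num_joints * scale**2 channels, which PixelShuffle
    rearranges into spatial resolution -- far fewer channels than carrying a
    wide feature map through repeated dense upsampling convolutions.
    """

    def __init__(self, in_ch: int, num_joints: int = 17, scale: int = 4):
        super().__init__()
        self.proj = nn.Conv2d(in_ch, num_joints * scale * scale, kernel_size=1)
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (B, in_ch, H, W) -> (B, num_joints * scale^2, H, W)
        #                  -> (B, num_joints, H * scale, W * scale)
        return self.shuffle(self.proj(x))


def distillation_loss(student_hm, teacher_hm, target_hm, alpha=0.5):
    """Blend ground-truth heatmap supervision with teacher mimicry (both MSE)."""
    hard = F.mse_loss(student_hm, target_hm)            # supervised term
    soft = F.mse_loss(student_hm, teacher_hm.detach())  # frozen-teacher term
    return alpha * hard + (1.0 - alpha) * soft


if __name__ == "__main__":
    feats = torch.randn(2, 256, 16, 12)       # toy student backbone features
    head = PixelShuffleHead(in_ch=256)
    student_hm = head(feats)                  # shape: (2, 17, 64, 48)
    teacher_hm = torch.rand_like(student_hm)  # stand-in for teacher output
    target_hm = torch.rand_like(student_hm)   # stand-in for ground truth
    print(distillation_loss(student_hm, teacher_hm, target_hm).item())
```

Whether KDLPN weights the teacher and ground-truth terms this way, or uses MSE for the teacher term at all, is not stated in this record; the blend above is simply the common heatmap-distillation formulation.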