Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution
| Main Authors: | Hu, Hao; Gao, Mengya; Wu, Mingsheng |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Hindawi, 2021 |
| Subjects: | Research Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8723848/ https://www.ncbi.nlm.nih.gov/pubmed/34987568 http://dx.doi.org/10.1155/2021/6702625 |
_version_ | 1784625808297426944 |
author | Hu, Hao; Gao, Mengya; Wu, Mingsheng
author_facet | Hu, Hao; Gao, Mengya; Wu, Mingsheng
author_sort | Hu, Hao |
collection | PubMed |
description | In the real-world scenario, data often have a long-tailed distribution, and training deep neural networks on such an imbalanced dataset has become a great challenge. The main problem caused by a long-tailed data distribution is that common classes dominate the training results, yielding very low accuracy on the rare classes. Recent work focuses on improving the network representation ability to overcome the long-tailed problem but often ignores adapting the network classifier to the long-tailed case, which causes the “incompatibility” problem between network representation and network classifier. In this paper, we use knowledge distillation to solve the long-tailed data distribution problem and fully optimize the network representation and classifier simultaneously. We propose multi-expert knowledge distillation with class-balanced sampling to jointly learn a high-quality network representation and classifier. In addition, a channel activation-based knowledge distillation method is proposed to further improve performance. State-of-the-art performance on several large-scale long-tailed classification datasets shows the superior generalization of our method. |
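The abstract names two concrete techniques: class-balanced sampling and channel activation-based knowledge distillation. Since this record gives no implementation details, the following is a minimal PyTorch-style sketch of both ideas; the function names, the inverse-frequency weighting, and the pooled-activation form of the channel loss are illustrative assumptions, not the authors' exact method.

```python
# Minimal sketch (assumptions, not the paper's exact method):
# 1) class-balanced sampling via inverse class-frequency weights;
# 2) a channel activation-style distillation loss matching the
#    spatially pooled channel activations of teacher and student.
import numpy as np
import torch.nn.functional as F
from torch.utils.data import WeightedRandomSampler

def class_balanced_sampler(labels):
    """Draw every class with (roughly) equal probability by weighting
    each example with the inverse frequency of its class."""
    labels = np.asarray(labels)
    counts = np.bincount(labels)
    weights = 1.0 / counts[labels]  # rare-class examples get larger weights
    return WeightedRandomSampler(weights.tolist(),
                                 num_samples=len(labels),
                                 replacement=True)

def channel_activation_kd_loss(student_feat, teacher_feat):
    """Hypothetical channel-activation loss: global-average-pool
    (N, C, H, W) feature maps to per-channel activations (N, C),
    L2-normalize across channels, and penalize the gap."""
    s = F.normalize(student_feat.mean(dim=(2, 3)), dim=1)
    t = F.normalize(teacher_feat.mean(dim=(2, 3)), dim=1)
    return F.mse_loss(s, t)
```

In use, the sampler would be passed to a `DataLoader` via its `sampler` argument to form class-balanced batches, and the distillation term would be added to the usual classification loss with a weighting coefficient.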
format | Online Article Text |
id | pubmed-8723848 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Hindawi |
record_format | MEDLINE/PubMed |
spelling | pubmed-8723848 2022-01-04 Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution Hu, Hao Gao, Mengya Wu, Mingsheng Comput Intell Neurosci Research Article In the real-world scenario, data often have a long-tailed distribution, and training deep neural networks on such an imbalanced dataset has become a great challenge. The main problem caused by a long-tailed data distribution is that common classes dominate the training results, yielding very low accuracy on the rare classes. Recent work focuses on improving the network representation ability to overcome the long-tailed problem but often ignores adapting the network classifier to the long-tailed case, which causes the “incompatibility” problem between network representation and network classifier. In this paper, we use knowledge distillation to solve the long-tailed data distribution problem and fully optimize the network representation and classifier simultaneously. We propose multi-expert knowledge distillation with class-balanced sampling to jointly learn a high-quality network representation and classifier. In addition, a channel activation-based knowledge distillation method is proposed to further improve performance. State-of-the-art performance on several large-scale long-tailed classification datasets shows the superior generalization of our method. Hindawi 2021-12-27 /pmc/articles/PMC8723848/ /pubmed/34987568 http://dx.doi.org/10.1155/2021/6702625 Text en Copyright © 2021 Hao Hu et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Research Article Hu, Hao Gao, Mengya Wu, Mingsheng Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution |
title | Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution |
title_full | Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution |
title_fullStr | Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution |
title_full_unstemmed | Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution |
title_short | Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution |
title_sort | relieving the incompatibility of network representation and classification for long-tailed data distribution |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8723848/ https://www.ncbi.nlm.nih.gov/pubmed/34987568 http://dx.doi.org/10.1155/2021/6702625 |
work_keys_str_mv | AT huhao relievingtheincompatibilityofnetworkrepresentationandclassificationforlongtaileddatadistribution AT gaomengya relievingtheincompatibilityofnetworkrepresentationandclassificationforlongtaileddatadistribution AT wumingsheng relievingtheincompatibilityofnetworkrepresentationandclassificationforlongtaileddatadistribution |