Deep Classification with Linearity-Enhanced Logits to Softmax Function
Recently, there has been a rapid increase in deep classification tasks, such as image recognition and target detection. As one of the most crucial components in Convolutional Neural Network (CNN) architectures, softmax arguably encourages CNNs to achieve better performance in image recognition. Under...
Main Authors: | Shao, Hao; Wang, Shunfang |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10217205/ https://www.ncbi.nlm.nih.gov/pubmed/37238482 http://dx.doi.org/10.3390/e25050727 |
_version_ | 1785048481003470848 |
---|---|
author | Shao, Hao; Wang, Shunfang |
author_facet | Shao, Hao; Wang, Shunfang |
author_sort | Shao, Hao |
collection | PubMed |
description | Recently, there has been a rapid increase in deep classification tasks, such as image recognition and target detection. As one of the most crucial components in Convolutional Neural Network (CNN) architectures, softmax arguably encourages CNNs to achieve better performance in image recognition. Under this scheme, we present a conceptually intuitive learning objective function: Orthogonal-Softmax. The primary property of the loss function is its use of a linear approximation model designed by Gram–Schmidt orthogonalization. Firstly, compared with the traditional softmax and Taylor-Softmax, Orthogonal-Softmax has a stronger linear relationship through orthogonal polynomial expansion. Secondly, a new loss function is proposed to acquire highly discriminative features for classification tasks. Finally, we present a linear softmax loss to further promote intra-class compactness and inter-class discrepancy simultaneously. The results of extensive experiments on four benchmark datasets demonstrate the validity of the presented method. In future work, we plan to explore non-ground-truth samples. |
format | Online Article Text |
id | pubmed-10217205 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-102172052023-05-27 Deep Classification with Linearity-Enhanced Logits to Softmax Function Shao, Hao Wang, Shunfang Entropy (Basel) Article Recently, there has been a rapid increase in deep classification tasks, such as image recognition and target detection. As one of the most crucial components in Convolutional Neural Network (CNN) architectures, softmax arguably encourages CNNs to achieve better performance in image recognition. Under this scheme, we present a conceptually intuitive learning objective function: Orthogonal-Softmax. The primary property of the loss function is its use of a linear approximation model designed by Gram–Schmidt orthogonalization. Firstly, compared with the traditional softmax and Taylor-Softmax, Orthogonal-Softmax has a stronger linear relationship through orthogonal polynomial expansion. Secondly, a new loss function is proposed to acquire highly discriminative features for classification tasks. Finally, we present a linear softmax loss to further promote intra-class compactness and inter-class discrepancy simultaneously. The results of extensive experiments on four benchmark datasets demonstrate the validity of the presented method. In future work, we plan to explore non-ground-truth samples. MDPI 2023-04-27 /pmc/articles/PMC10217205/ /pubmed/37238482 http://dx.doi.org/10.3390/e25050727 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Shao, Hao Wang, Shunfang Deep Classification with Linearity-Enhanced Logits to Softmax Function |
title | Deep Classification with Linearity-Enhanced Logits to Softmax Function |
title_full | Deep Classification with Linearity-Enhanced Logits to Softmax Function |
title_fullStr | Deep Classification with Linearity-Enhanced Logits to Softmax Function |
title_full_unstemmed | Deep Classification with Linearity-Enhanced Logits to Softmax Function |
title_short | Deep Classification with Linearity-Enhanced Logits to Softmax Function |
title_sort | deep classification with linearity-enhanced logits to softmax function |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10217205/ https://www.ncbi.nlm.nih.gov/pubmed/37238482 http://dx.doi.org/10.3390/e25050727 |
work_keys_str_mv | AT shaohao deepclassificationwithlinearityenhancedlogitstosoftmaxfunction AT wangshunfang deepclassificationwithlinearityenhancedlogitstosoftmaxfunction |
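The abstract in this record contrasts the standard softmax with Taylor-Softmax, which replaces the exponential in the softmax normalization with a truncated Taylor expansion of exp. The paper's own Orthogonal-Softmax is not specified in this record, so the sketch below only illustrates the two baselines it is compared against; names and the expansion order are illustrative choices, not taken from the paper.

```python
import math

def softmax(logits):
    # Standard softmax: exponentiate shifted logits and normalize.
    # Subtracting the max is the usual numerical-stability trick.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def taylor_softmax(logits, order=2):
    # Taylor-Softmax: replace exp(z) with its truncated Taylor series
    # 1 + z + z^2/2! + ... + z^order/order!. For even orders the
    # polynomial is strictly positive, so normalizing still yields a
    # valid probability distribution.
    def taylor_exp(z):
        return sum(z ** k / math.factorial(k) for k in range(order + 1))
    vals = [taylor_exp(z) for z in logits]
    s = sum(vals)
    return [v / s for v in vals]
```

For logits [1.0, 2.0, 3.0], both functions return a distribution that sums to 1 and preserves the ordering of the logits; the Taylor variant differs in how sharply it concentrates probability on the largest logit.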