
Self-organizing maps on “what-where” codes towards fully unsupervised classification

Bibliographic Details

Main Authors: Sa-Couto, Luis, Wichert, Andreas

Format: Online Article Text

Language: English

Published: Springer Berlin Heidelberg 2023

Subjects:

Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10258173/
https://www.ncbi.nlm.nih.gov/pubmed/37188974
http://dx.doi.org/10.1007/s00422-023-00963-y
_version_ 1785057423542714368
author Sa-Couto, Luis
Wichert, Andreas
author_facet Sa-Couto, Luis
Wichert, Andreas
author_sort Sa-Couto, Luis
collection PubMed
description Interest in unsupervised learning architectures has been rising. Besides being biologically unnatural, it is costly to depend on large labeled data sets to get a well-performing classification system. Therefore, both the deep learning community and the more biologically inspired models community have focused on proposing unsupervised techniques that can produce adequate hidden representations which can then be fed to a simpler supervised classifier. Despite great success with this approach, an ultimate dependence on a supervised model remains, which forces the number of classes to be known beforehand and makes the system depend on labels to extract concepts. To overcome this limitation, recent work has shown how a self-organizing map (SOM) can be used as a completely unsupervised classifier. However, to achieve success it required deep learning techniques to generate high-quality embeddings. The purpose of this work is to show that we can use our previously proposed What-Where encoder in tandem with the SOM to get an end-to-end unsupervised system that is Hebbian. Such a system requires no labels to train, nor does it require knowledge of which classes exist beforehand. It can be trained online and adapt to new classes that may emerge. As in the original work, we use the MNIST data set to run an experimental analysis and verify that the system achieves accuracies similar to the best ones reported thus far. Furthermore, we extend the analysis to the more difficult Fashion-MNIST problem and conclude that the system still performs well.
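The abstract describes an online, label-free SOM whose units self-organize into class prototypes. A minimal sketch of that idea is below; the grid size, input dimensionality, learning-rate schedule, and all names are illustrative assumptions, not the authors' actual What-Where configuration.

```python
import numpy as np

# Minimal self-organizing map (SOM) sketch of the kind of unsupervised,
# Hebbian-style learning the abstract describes. All parameters here are
# illustrative placeholders, not the paper's actual setup.
rng = np.random.default_rng(0)

grid = 5                      # 5x5 map of units (assumed size)
dim = 16                      # input dimensionality (stand-in for an encoding)
weights = rng.random((grid * grid, dim))

# 2-D coordinate of each unit on the map, used by the neighborhood kernel
coords = np.array([(i, j) for i in range(grid) for j in range(grid)], dtype=float)

def train_step(x, lr=0.5, sigma=1.0):
    """One online update: find the best-matching unit (BMU) for input x,
    then pull the BMU and its map neighbors toward x. No labels involved."""
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
    dist2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
    h = np.exp(-dist2 / (2 * sigma ** 2))          # Gaussian neighborhood
    weights[:] += lr * h[:, None] * (x - weights)  # Hebbian-like pull toward x
    return bmu

# Online training on a stream of (here random) inputs; the learning rate decays.
for t in range(200):
    x = rng.random(dim)
    train_step(x, lr=0.5 * (1 - t / 200))

# After training, each unit's weight vector is a prototype: inputs that map
# to the same unit can be read as one emergent "class", with no label ever used.
```

Because every update only needs the current input, the map can keep adapting as new data (and thus new classes) arrive, which is the online property the abstract emphasizes.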
format Online
Article
Text
id pubmed-10258173
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Springer Berlin Heidelberg
record_format MEDLINE/PubMed
spelling pubmed-102581732023-06-13 Self-organizing maps on “what-where” codes towards fully unsupervised classification Sa-Couto, Luis Wichert, Andreas Biol Cybern Original Article Interest in unsupervised learning architectures has been rising. Besides being biologically unnatural, it is costly to depend on large labeled data sets to get a well-performing classification system. Therefore, both the deep learning community and the more biologically inspired models community have focused on proposing unsupervised techniques that can produce adequate hidden representations which can then be fed to a simpler supervised classifier. Despite great success with this approach, an ultimate dependence on a supervised model remains, which forces the number of classes to be known beforehand and makes the system depend on labels to extract concepts. To overcome this limitation, recent work has shown how a self-organizing map (SOM) can be used as a completely unsupervised classifier. However, to achieve success it required deep learning techniques to generate high-quality embeddings. The purpose of this work is to show that we can use our previously proposed What-Where encoder in tandem with the SOM to get an end-to-end unsupervised system that is Hebbian. Such a system requires no labels to train, nor does it require knowledge of which classes exist beforehand. It can be trained online and adapt to new classes that may emerge. As in the original work, we use the MNIST data set to run an experimental analysis and verify that the system achieves accuracies similar to the best ones reported thus far. Furthermore, we extend the analysis to the more difficult Fashion-MNIST problem and conclude that the system still performs well.
Springer Berlin Heidelberg 2023-05-15 2023 /pmc/articles/PMC10258173/ /pubmed/37188974 http://dx.doi.org/10.1007/s00422-023-00963-y Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Original Article
Sa-Couto, Luis
Wichert, Andreas
Self-organizing maps on “what-where” codes towards fully unsupervised classification
title Self-organizing maps on “what-where” codes towards fully unsupervised classification
title_full Self-organizing maps on “what-where” codes towards fully unsupervised classification
title_fullStr Self-organizing maps on “what-where” codes towards fully unsupervised classification
title_full_unstemmed Self-organizing maps on “what-where” codes towards fully unsupervised classification
title_short Self-organizing maps on “what-where” codes towards fully unsupervised classification
title_sort self-organizing maps on “what-where” codes towards fully unsupervised classification
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10258173/
https://www.ncbi.nlm.nih.gov/pubmed/37188974
http://dx.doi.org/10.1007/s00422-023-00963-y
work_keys_str_mv AT sacoutoluis selforganizingmapsonwhatwherecodestowardsfullyunsupervisedclassification
AT wichertandreas selforganizingmapsonwhatwherecodestowardsfullyunsupervisedclassification