Unsupervised Few-Shot Feature Learning via Self-Supervised Training
Main Authors: | Ji, Zilong, Zou, Xiaolong, Huang, Tiejun, Wu, Si |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2020 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7592391/ https://www.ncbi.nlm.nih.gov/pubmed/33178000 http://dx.doi.org/10.3389/fncom.2020.00083 |
_version_ | 1783601174388146176 |
---|---|
author | Ji, Zilong Zou, Xiaolong Huang, Tiejun Wu, Si |
author_facet | Ji, Zilong Zou, Xiaolong Huang, Tiejun Wu, Si |
author_sort | Ji, Zilong |
collection | PubMed |
description | Learning from limited exemplars (few-shot learning) is a fundamental, unsolved problem that has been laboriously explored in the machine learning community. However, current few-shot learners are mostly supervised and rely heavily on a large number of labeled examples. Unsupervised learning is a more natural procedure for cognitive mammals and has produced promising results in many machine learning tasks. In this paper, we propose an unsupervised feature learning method for few-shot learning. The proposed model consists of two alternating processes, progressive clustering and episodic training. The former generates pseudo-labeled training examples for constructing episodic tasks, and the latter trains the few-shot learner on the generated episodic tasks, which further optimizes the feature representations of the data. The two processes facilitate each other and eventually produce a high-quality few-shot learner. In our experiments, our model achieves good generalization performance on a variety of downstream few-shot learning tasks on Omniglot and MiniImageNet. We also construct a new few-shot person re-identification dataset, FS-Market1501, to demonstrate the applicability of our model to a real-world application. |
format | Online Article Text |
id | pubmed-7592391 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-75923912020-11-10 Unsupervised Few-Shot Feature Learning via Self-Supervised Training Ji, Zilong Zou, Xiaolong Huang, Tiejun Wu, Si Front Comput Neurosci Neuroscience Learning from limited exemplars (few-shot learning) is a fundamental, unsolved problem that has been laboriously explored in the machine learning community. However, current few-shot learners are mostly supervised and rely heavily on a large number of labeled examples. Unsupervised learning is a more natural procedure for cognitive mammals and has produced promising results in many machine learning tasks. In this paper, we propose an unsupervised feature learning method for few-shot learning. The proposed model consists of two alternating processes, progressive clustering and episodic training. The former generates pseudo-labeled training examples for constructing episodic tasks, and the latter trains the few-shot learner on the generated episodic tasks, which further optimizes the feature representations of the data. The two processes facilitate each other and eventually produce a high-quality few-shot learner. In our experiments, our model achieves good generalization performance on a variety of downstream few-shot learning tasks on Omniglot and MiniImageNet. We also construct a new few-shot person re-identification dataset, FS-Market1501, to demonstrate the applicability of our model to a real-world application. Frontiers Media S.A. 2020-10-14 /pmc/articles/PMC7592391/ /pubmed/33178000 http://dx.doi.org/10.3389/fncom.2020.00083 Text en Copyright © 2020 Ji, Zou, Huang and Wu. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Ji, Zilong Zou, Xiaolong Huang, Tiejun Wu, Si Unsupervised Few-Shot Feature Learning via Self-Supervised Training |
title | Unsupervised Few-Shot Feature Learning via Self-Supervised Training |
title_full | Unsupervised Few-Shot Feature Learning via Self-Supervised Training |
title_fullStr | Unsupervised Few-Shot Feature Learning via Self-Supervised Training |
title_full_unstemmed | Unsupervised Few-Shot Feature Learning via Self-Supervised Training |
title_short | Unsupervised Few-Shot Feature Learning via Self-Supervised Training |
title_sort | unsupervised few-shot feature learning via self-supervised training |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7592391/ https://www.ncbi.nlm.nih.gov/pubmed/33178000 http://dx.doi.org/10.3389/fncom.2020.00083 |
work_keys_str_mv | AT jizilong unsupervisedfewshotfeaturelearningviaselfsupervisedtraining AT zouxiaolong unsupervisedfewshotfeaturelearningviaselfsupervisedtraining AT huangtiejun unsupervisedfewshotfeaturelearningviaselfsupervisedtraining AT wusi unsupervisedfewshotfeaturelearningviaselfsupervisedtraining |
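As a reading aid for the description above, the following is a minimal, hypothetical sketch of the alternating scheme it outlines: cluster the current embeddings to obtain pseudo-labels, sample N-way K-shot episodes from those pseudo-classes, and train the embedding network on them. The use of k-means for the clustering step, the prototypical-network-style episodic loss, and all names and hyperparameters (`progressive_clustering`, `sample_episode`, `n_clusters`, etc.) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an alternating clustering / episodic-training loop (not the paper's code).
# Assumes `encoder` maps a batch of images to flat feature vectors of shape (B, D).
import numpy as np
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def progressive_clustering(encoder, images, n_clusters):
    """Embed all unlabeled images and assign pseudo-labels (k-means is an illustrative choice)."""
    with torch.no_grad():
        feats = encoder(images).cpu().numpy()
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(feats)


def sample_episode(images, pseudo_labels, n_way=5, k_shot=1, n_query=5):
    """Sample an N-way K-shot episode from pseudo-classes (assumes each class has enough members)."""
    classes = np.random.choice(np.unique(pseudo_labels), n_way, replace=False)
    support, query, targets = [], [], []
    for i, c in enumerate(classes):
        idx = np.random.permutation(np.where(pseudo_labels == c)[0])[: k_shot + n_query]
        support.append(images[idx[:k_shot]])
        query.append(images[idx[k_shot:]])
        targets += [i] * n_query
    return torch.cat(support), torch.cat(query), torch.tensor(targets)


def episodic_step(encoder, optimizer, support, query, targets, n_way, k_shot):
    """One prototypical-network-style update: classify queries by distance to class prototypes."""
    protos = encoder(support).view(n_way, k_shot, -1).mean(dim=1)  # one prototype per pseudo-class
    logits = -torch.cdist(encoder(query), protos)                  # negative distance as class score
    loss = F.cross_entropy(logits, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


def train(encoder, optimizer, images, rounds=10, episodes_per_round=100, n_clusters=64):
    """Alternate the two processes: better features yield better clusters, and vice versa."""
    for _ in range(rounds):
        pseudo_labels = progressive_clustering(encoder, images, n_clusters)
        for _ in range(episodes_per_round):
            s, q, t = sample_episode(images, pseudo_labels)
            episodic_step(encoder, optimizer, s, q, t, n_way=5, k_shot=1)
```

The sketch only makes the feedback loop from the abstract concrete with off-the-shelf components: pseudo-labels supply episodic tasks, episodic training refines the features, and the refined features feed the next clustering round.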