Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation
In the sparse representation model, the design of overcomplete dictionaries plays a key role in the model's effectiveness and its applicability across different domains. Recent research has produced several dictionary learning approaches and has shown that dictionaries learnt from data examples significantly outperform structured ones, e.g. wavelet transforms. In this context, learning consists of adapting the dictionary atoms to a set of training signals so as to promote a sparse representation that minimizes the reconstruction error. Finding the best-fitting dictionary remains a very difficult task, and the question is still open. A well-established heuristic for tackling this problem is an iterative alternating scheme, adopted for instance in the well-known K-SVD algorithm. Essentially, it repeats two stages: the former promotes sparse coding of the training set, and the latter adapts the dictionary to reduce the reconstruction error. In this paper we present R-SVD, a new method that, while maintaining the alternating scheme, adopts Orthogonal Procrustes analysis to update the dictionary atoms, suitably arranged into groups. Comparative experiments on synthetic data demonstrate the effectiveness of R-SVD with respect to well-known dictionary learning algorithms such as K-SVD, ILS-DLA and the online method OSDL. Moreover, experiments on natural data such as ECG compression, EEG sparse representation, and image modeling confirm R-SVD’s robustness and wide applicability.
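The dictionary update referred to in the abstract rests on the classical orthogonal Procrustes problem: given matrices A and B of the same shape, find the orthogonal matrix R minimizing ||R A − B|| in the Frobenius norm, which has a closed-form solution via a single SVD. Below is a minimal NumPy sketch of that closed form; it is illustrative only and not code from the paper, and the function name and toy check are ours.

```python
# Minimal sketch of the orthogonal Procrustes closed form (illustrative, not the paper's code):
# the orthogonal R minimizing ||R A - B||_F is R = U V^T, where U S V^T is the SVD of B A^T.
import numpy as np

def orthogonal_procrustes(A, B):
    """Return the orthogonal matrix R that minimizes ||R @ A - B||_F."""
    U, _, Vt = np.linalg.svd(B @ A.T)
    return U @ Vt

# Toy check: recover a random rotation from noiseless data.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))   # a random orthogonal matrix
A = rng.standard_normal((8, 20))
R = orthogonal_procrustes(A, Q @ A)
assert np.allclose(R, Q)
```

SciPy ships the same closed form, in the transposed convention min ||A R − B||, as scipy.linalg.orthogonal_procrustes.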
Main Authors: | Grossi, Giuliano; Lanzarotti, Raffaella; Lin, Jianyi |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2017 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5245881/ https://www.ncbi.nlm.nih.gov/pubmed/28103283 http://dx.doi.org/10.1371/journal.pone.0169663 |
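The abstract also outlines the iterative alternating scheme itself: a sparse-coding stage over the training set followed by a dictionary-update stage, with R-SVD applying Procrustes analysis to atoms arranged into groups. The sketch below illustrates such a two-stage loop under explicit assumptions of ours (OMP for the coding stage, contiguous equal-size atom groups, a fixed iteration count); it is not the authors' R-SVD implementation, and every name and parameter here is illustrative.

```python
# Simplified alternating dictionary-learning loop in the spirit of the abstract:
# stage 1 sparse-codes the training set, stage 2 rotates groups of atoms with an
# orthogonal Procrustes fit. Illustrative sketch only, not the paper's R-SVD.
import numpy as np
from sklearn.linear_model import orthogonal_mp

def procrustes_rotation(A, B):
    # Orthogonal R minimizing ||R @ A - B||_F (closed form via the SVD of B @ A.T).
    U, _, Vt = np.linalg.svd(B @ A.T)
    return U @ Vt

def learn_dictionary(Y, n_atoms, sparsity, n_iter=20, group_size=4, seed=0):
    """Toy alternating scheme on training signals Y (one signal per column)."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((Y.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)                        # unit-norm atoms
    for _ in range(n_iter):
        # Stage 1: sparse coding of the training set (here via OMP).
        X = orthogonal_mp(D, Y, n_nonzero_coefs=sparsity)
        # Stage 2: rotate each group of atoms so that its contribution best
        # matches the residual left unexplained by the remaining atoms.
        for g in np.array_split(np.arange(n_atoms), max(1, n_atoms // group_size)):
            E = Y - np.delete(D, g, axis=1) @ np.delete(X, g, axis=0)
            R = procrustes_rotation(D[:, g] @ X[g, :], E)
            D[:, g] = R @ D[:, g]                         # orthogonal R keeps atoms unit-norm
    X = orthogonal_mp(D, Y, n_nonzero_coefs=sparsity)     # codes for the final dictionary
    return D, X

# Toy usage on synthetic data: 16-dimensional signals, 32 atoms, 3-sparse codes.
rng = np.random.default_rng(1)
Y = rng.standard_normal((16, 200))
D, X = learn_dictionary(Y, n_atoms=32, sparsity=3, n_iter=10)
print("relative reconstruction error:", np.linalg.norm(Y - D @ X) / np.linalg.norm(Y))
```

Because the identity is always a feasible rotation, each group update cannot increase the residual error for fixed codes, which is what keeps the alternating loop well behaved.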
_version_ | 1782496899243180032 |
---|---|
author | Grossi, Giuliano; Lanzarotti, Raffaella; Lin, Jianyi
author_facet | Grossi, Giuliano; Lanzarotti, Raffaella; Lin, Jianyi
author_sort | Grossi, Giuliano |
collection | PubMed |
description | In the sparse representation model, the design of overcomplete dictionaries plays a key role in the model's effectiveness and its applicability across different domains. Recent research has produced several dictionary learning approaches and has shown that dictionaries learnt from data examples significantly outperform structured ones, e.g. wavelet transforms. In this context, learning consists of adapting the dictionary atoms to a set of training signals so as to promote a sparse representation that minimizes the reconstruction error. Finding the best-fitting dictionary remains a very difficult task, and the question is still open. A well-established heuristic for tackling this problem is an iterative alternating scheme, adopted for instance in the well-known K-SVD algorithm. Essentially, it repeats two stages: the former promotes sparse coding of the training set, and the latter adapts the dictionary to reduce the reconstruction error. In this paper we present R-SVD, a new method that, while maintaining the alternating scheme, adopts Orthogonal Procrustes analysis to update the dictionary atoms, suitably arranged into groups. Comparative experiments on synthetic data demonstrate the effectiveness of R-SVD with respect to well-known dictionary learning algorithms such as K-SVD, ILS-DLA and the online method OSDL. Moreover, experiments on natural data such as ECG compression, EEG sparse representation, and image modeling confirm R-SVD’s robustness and wide applicability. |
format | Online Article Text |
id | pubmed-5245881 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2017 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-5245881 2017-02-06 Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation Grossi, Giuliano Lanzarotti, Raffaella Lin, Jianyi PLoS One Research Article In the sparse representation model, the design of overcomplete dictionaries plays a key role in the model's effectiveness and its applicability across different domains. Recent research has produced several dictionary learning approaches and has shown that dictionaries learnt from data examples significantly outperform structured ones, e.g. wavelet transforms. In this context, learning consists of adapting the dictionary atoms to a set of training signals so as to promote a sparse representation that minimizes the reconstruction error. Finding the best-fitting dictionary remains a very difficult task, and the question is still open. A well-established heuristic for tackling this problem is an iterative alternating scheme, adopted for instance in the well-known K-SVD algorithm. Essentially, it repeats two stages: the former promotes sparse coding of the training set, and the latter adapts the dictionary to reduce the reconstruction error. In this paper we present R-SVD, a new method that, while maintaining the alternating scheme, adopts Orthogonal Procrustes analysis to update the dictionary atoms, suitably arranged into groups. Comparative experiments on synthetic data demonstrate the effectiveness of R-SVD with respect to well-known dictionary learning algorithms such as K-SVD, ILS-DLA and the online method OSDL. Moreover, experiments on natural data such as ECG compression, EEG sparse representation, and image modeling confirm R-SVD’s robustness and wide applicability. Public Library of Science 2017-01-19 /pmc/articles/PMC5245881/ /pubmed/28103283 http://dx.doi.org/10.1371/journal.pone.0169663 Text en © 2017 Grossi et al http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Grossi, Giuliano Lanzarotti, Raffaella Lin, Jianyi Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation |
title | Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation |
title_full | Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation |
title_fullStr | Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation |
title_full_unstemmed | Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation |
title_short | Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation |
title_sort | orthogonal procrustes analysis for dictionary learning in sparse linear representation |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5245881/ https://www.ncbi.nlm.nih.gov/pubmed/28103283 http://dx.doi.org/10.1371/journal.pone.0169663 |
work_keys_str_mv | AT grossigiuliano orthogonalprocrustesanalysisfordictionarylearninginsparselinearrepresentation AT lanzarottiraffaella orthogonalprocrustesanalysisfordictionarylearninginsparselinearrepresentation AT linjianyi orthogonalprocrustesanalysisfordictionarylearninginsparselinearrepresentation |