Low-rank robust online distance/similarity learning based on the rescaled hinge loss

An important challenge in metric learning is scalability to both the size and the dimension of input data. Online metric learning algorithms have been proposed to address this challenge. Existing methods are commonly based on the Passive/Aggressive (PA) approach. Hence, they can rapidly process large volumes of data...

Full description

Bibliographic Details
Main Authors: Zabihzadeh, Davood, Tuama, Amar, Karami-Mollaee, Ali, Mousavirad, Seyed Jalaleddin
Format: Online Article Text
Language: English
Published: Springer US 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9020766/
https://www.ncbi.nlm.nih.gov/pubmed/35469120
http://dx.doi.org/10.1007/s10489-022-03419-1
_version_ 1784689632952188928
author Zabihzadeh, Davood
Tuama, Amar
Karami-Mollaee, Ali
Mousavirad, Seyed Jalaleddin
author_facet Zabihzadeh, Davood
Tuama, Amar
Karami-Mollaee, Ali
Mousavirad, Seyed Jalaleddin
author_sort Zabihzadeh, Davood
collection PubMed
description An important challenge in metric learning is scalability to both the size and the dimension of input data. Online metric learning algorithms have been proposed to address this challenge. Existing methods are commonly based on the Passive/Aggressive (PA) approach. Hence, they can rapidly process large volumes of data with an adaptive learning rate. However, these algorithms are based on the hinge loss and are therefore not robust to outliers and label noise. We address these challenges by formulating the online distance/similarity learning problem with the robust rescaled hinge loss function. The proposed model is rather general and can be applied to any PA-based online distance/similarity algorithm. To achieve scalability to data dimension, we propose low-rank online distance/similarity methods that learn a rectangular projection matrix instead of a full Mahalanobis matrix. The low-rank approaches not only reduce the computational cost but also preserve the discrimination power of the learned metrics. Moreover, current online methods usually assume that training triplets or pairwise constraints exist in advance. However, this assumption does not always hold, and generating triplets with available batch sampling methods is both time- and space-consuming. We address this issue by developing an efficient yet effective robust one-pass triplet construction algorithm. We conduct several experiments on datasets from various applications. The results confirm that the proposed methods outperform state-of-the-art online metric learning methods by a large margin in the presence of label noise and outliers.
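The abstract names two concrete ingredients: a bounded rescaled hinge loss in place of the standard hinge loss, and a low-rank Mahalanobis metric parameterized by a rectangular projection matrix L (so that M = L^T L). The following Python/NumPy sketch illustrates those two ideas only; it is not the authors' implementation. The rescaled hinge form beta * (1 - exp(-eta * hinge)) with beta = 1 / (1 - exp(-eta)) follows the common definition in the robust-loss literature, and the parameter values (eta, margin, rank r) are illustrative assumptions rather than values from the paper.

# Hedged sketch under the assumptions stated above; not the paper's code.
import numpy as np

def hinge(violation):
    """Standard hinge loss: max(0, violation); unbounded, so outliers dominate."""
    return np.maximum(0.0, violation)

def rescaled_hinge(violation, eta=0.5):
    """Rescaled hinge: beta * (1 - exp(-eta * hinge(violation))).
    Bounded above by beta, so a single noisy or outlying constraint
    contributes at most a fixed amount to the objective."""
    beta = 1.0 / (1.0 - np.exp(-eta))   # scales the loss so a unit hinge loss maps to 1
    return beta * (1.0 - np.exp(-eta * hinge(violation)))

def lowrank_sq_distance(L, x, y):
    """Squared Mahalanobis distance with M = L.T @ L, computed through the
    r x d projection L; avoids storing or updating the full d x d matrix."""
    diff = L @ (x - y)
    return float(diff @ diff)

def robust_triplet_loss(L, anchor, positive, negative, margin=1.0, eta=0.5):
    """Rescaled hinge applied to a triplet constraint:
    d(anchor, positive) + margin <= d(anchor, negative)."""
    violation = (margin
                 + lowrank_sq_distance(L, anchor, positive)
                 - lowrank_sq_distance(L, anchor, negative))
    return float(rescaled_hinge(violation, eta))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, r = 50, 5                           # input dimension d, low rank r << d
    L = 0.1 * rng.standard_normal((r, d))  # rectangular projection matrix
    a, p, n = rng.standard_normal((3, d))  # one (anchor, positive, negative) triplet
    print(robust_triplet_loss(L, a, p, n))

In a PA-style online scheme, L would be updated only when this loss is positive, with a step size controlled by the aggressiveness parameter; those update rules and the one-pass triplet construction are specific to the paper and are not reproduced here.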
format Online
Article
Text
id pubmed-9020766
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Springer US
record_format MEDLINE/PubMed
spelling pubmed-9020766 2022-04-21 Low-rank robust online distance/similarity learning based on the rescaled hinge loss Zabihzadeh, Davood Tuama, Amar Karami-Mollaee, Ali Mousavirad, Seyed Jalaleddin Appl Intell (Dordr) Article An important challenge in metric learning is scalability to both the size and the dimension of input data. Online metric learning algorithms have been proposed to address this challenge. Existing methods are commonly based on the Passive/Aggressive (PA) approach. Hence, they can rapidly process large volumes of data with an adaptive learning rate. However, these algorithms are based on the hinge loss and are therefore not robust to outliers and label noise. We address these challenges by formulating the online distance/similarity learning problem with the robust rescaled hinge loss function. The proposed model is rather general and can be applied to any PA-based online distance/similarity algorithm. To achieve scalability to data dimension, we propose low-rank online distance/similarity methods that learn a rectangular projection matrix instead of a full Mahalanobis matrix. The low-rank approaches not only reduce the computational cost but also preserve the discrimination power of the learned metrics. Moreover, current online methods usually assume that training triplets or pairwise constraints exist in advance. However, this assumption does not always hold, and generating triplets with available batch sampling methods is both time- and space-consuming. We address this issue by developing an efficient yet effective robust one-pass triplet construction algorithm. We conduct several experiments on datasets from various applications. The results confirm that the proposed methods outperform state-of-the-art online metric learning methods by a large margin in the presence of label noise and outliers. Springer US 2022-04-20 2023 /pmc/articles/PMC9020766/ /pubmed/35469120 http://dx.doi.org/10.1007/s10489-022-03419-1 Text en © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022 This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.
spellingShingle Article
Zabihzadeh, Davood
Tuama, Amar
Karami-Mollaee, Ali
Mousavirad, Seyed Jalaleddin
Low-rank robust online distance/similarity learning based on the rescaled hinge loss
title Low-rank robust online distance/similarity learning based on the rescaled hinge loss
title_full Low-rank robust online distance/similarity learning based on the rescaled hinge loss
title_fullStr Low-rank robust online distance/similarity learning based on the rescaled hinge loss
title_full_unstemmed Low-rank robust online distance/similarity learning based on the rescaled hinge loss
title_short Low-rank robust online distance/similarity learning based on the rescaled hinge loss
title_sort low-rank robust online distance/similarity learning based on the rescaled hinge loss
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9020766/
https://www.ncbi.nlm.nih.gov/pubmed/35469120
http://dx.doi.org/10.1007/s10489-022-03419-1
work_keys_str_mv AT zabihzadehdavood lowrankrobustonlinedistancesimilaritylearningbasedontherescaledhingeloss
AT tuamaamar lowrankrobustonlinedistancesimilaritylearningbasedontherescaledhingeloss
AT karamimollaeeali lowrankrobustonlinedistancesimilaritylearningbasedontherescaledhingeloss
AT mousaviradseyedjalaleddin lowrankrobustonlinedistancesimilaritylearningbasedontherescaledhingeloss