Fractional ridge regression: a fast, interpretable reparameterization of ridge regression
Main Authors: Rokem, Ariel; Kay, Kendrick
Format: Online Article Text
Language: English
Published: Oxford University Press, 2020
Subjects: Technical Note
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7702219/ https://www.ncbi.nlm.nih.gov/pubmed/33252656 http://dx.doi.org/10.1093/gigascience/giaa133
_version_ | 1783616570194395136 |
author | Rokem, Ariel Kay, Kendrick |
author_facet | Rokem, Ariel Kay, Kendrick |
author_sort | Rokem, Ariel |
collection | PubMed |
description | BACKGROUND: Ridge regression is a regularization technique that penalizes the L2-norm of the coefficients in linear regression. One of the challenges of using ridge regression is the need to set a hyperparameter (α) that controls the amount of regularization. Cross-validation is typically used to select the best α from a set of candidates. However, efficient and appropriate selection of α can be challenging. This becomes prohibitive when large amounts of data are analyzed. Because the selected α depends on the scale of the data and correlations across predictors, it is also not straightforwardly interpretable. RESULTS: The present work addresses these challenges through a novel approach to ridge regression. We propose to reparameterize ridge regression in terms of the ratio γ between the L2-norms of the regularized and unregularized coefficients. We provide an algorithm that efficiently implements this approach, called fractional ridge regression, as well as open-source software implementations in Python and MATLAB (https://github.com/nrdg/fracridge). We show that the proposed method is fast and scalable for large-scale data problems. In brain imaging data, we demonstrate that this approach delivers results that are straightforward to interpret and compare across models and datasets. CONCLUSION: Fractional ridge regression has several benefits: the solutions obtained for different γ are guaranteed to vary, guarding against wasted calculations, and they automatically span the relevant range of regularization, avoiding the need for arduous manual exploration. These properties make fractional ridge regression particularly suitable for analysis of large, complex datasets. |
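The abstract above reparameterizes ridge regression by the ratio γ = ‖β(α)‖₂ / ‖β(0)‖₂ between the regularized and unregularized coefficient norms. A minimal NumPy sketch of that idea follows; it is not the authors' fracridge implementation (which uses a more efficient interpolation scheme), just an illustration that solves ridge via the SVD of the design matrix and bisects on α until the norm ratio hits a target γ. All variable names here are illustrative assumptions.

```python
import numpy as np

# Simulated regression problem (illustrative data, not from the paper)
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.5 * rng.standard_normal(n)

# SVD of the design matrix lets us evaluate ridge solutions for many
# values of alpha cheaply: beta(alpha) = V diag(s / (s^2 + alpha)) U^T y
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Uty = U.T @ y

def ridge_coef(alpha):
    d = s / (s**2 + alpha)
    return Vt.T @ (d * Uty)

# alpha = 0 recovers the unregularized (OLS) solution
ols_norm = np.linalg.norm(ridge_coef(0.0))

def coef_for_fraction(gamma, hi=1e6):
    # ||beta(alpha)|| decreases monotonically in alpha, so bisection on
    # alpha finds the point where the norm ratio equals the target gamma
    lo = 0.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        frac = np.linalg.norm(ridge_coef(mid)) / ols_norm
        if frac > gamma:
            lo = mid  # too little regularization; increase alpha
        else:
            hi = mid
    return ridge_coef(0.5 * (lo + hi))

beta = coef_for_fraction(0.5)
print(np.linalg.norm(beta) / ols_norm)  # ≈ 0.5 by construction
```

This makes the interpretability claim concrete: γ = 0.5 means "coefficients shrunk to half their unregularized norm" regardless of the data's scale, whereas the α that achieves this differs from dataset to dataset.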
format | Online Article Text |
id | pubmed-7702219 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Oxford University Press |
record_format | MEDLINE/PubMed |
spelling | pubmed-77022192020-12-07 Fractional ridge regression: a fast, interpretable reparameterization of ridge regression Rokem, Ariel Kay, Kendrick Gigascience Technical Note
Oxford University Press 2020-11-30 /pmc/articles/PMC7702219/ /pubmed/33252656 http://dx.doi.org/10.1093/gigascience/giaa133 Text en © The Author(s) 2020. Published by Oxford University Press GigaScience. http://creativecommons.org/licenses/by/4.0/ This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Technical Note Rokem, Ariel Kay, Kendrick Fractional ridge regression: a fast, interpretable reparameterization of ridge regression |
title | Fractional ridge regression: a fast, interpretable reparameterization of ridge regression |
title_full | Fractional ridge regression: a fast, interpretable reparameterization of ridge regression |
title_fullStr | Fractional ridge regression: a fast, interpretable reparameterization of ridge regression |
title_full_unstemmed | Fractional ridge regression: a fast, interpretable reparameterization of ridge regression |
title_short | Fractional ridge regression: a fast, interpretable reparameterization of ridge regression |
title_sort | fractional ridge regression: a fast, interpretable reparameterization of ridge regression |
topic | Technical Note |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7702219/ https://www.ncbi.nlm.nih.gov/pubmed/33252656 http://dx.doi.org/10.1093/gigascience/giaa133 |
work_keys_str_mv | AT rokemariel fractionalridgeregressionafastinterpretablereparameterizationofridgeregression AT kaykendrick fractionalridgeregressionafastinterpretablereparameterizationofridgeregression |