A greedy regression algorithm with coarse weights offers novel advantages

Regularized regression analysis is a mature analytic approach for identifying weighted sums of variables that predict outcomes. We present a novel Coarse Approximation Linear Function (CALF) that frugally selects important predictors and builds simple but powerful predictive models. CALF is a linear regression strategy applied to normalized data that uses only nonzero weights of +1 or −1. The qualitative (linearly invariant) metric to be optimized can be the Welch (Student) t-test p-value or the area under the receiver operating characteristic curve (AUC) for a binary response, or the Pearson correlation for a real-valued response. Predictor weighting is critically important when developing risk prediction models. Although counterintuitive, such qualitative metrics can favor CALF with its ±1 weights over algorithms that produce real-number weights. Moreover, while regression methods may be expected to change most or all weight values after even small changes in the input data (e.g., discarding a single subject out of hundreds), CALF weights generally do not change. Similarly, some regression methods applied to collinear or nearly collinear variables yield weight vectors whose magnitude or direction (in p-space) is unpredictable. In contrast, if some predictors are linearly dependent or nearly so, CALF simply chooses at most one of them (the most informative, if any) and ignores the others, avoiding the inclusion of two or more collinear variables in the model.
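
The abstract describes CALF as a greedy search over ±1 weights on normalized predictors, optimizing a linearly invariant metric such as the Pearson correlation. Below is a minimal Python sketch of that idea, assuming the greedy loop tries each unused predictor with weight +1 or −1 and keeps the single addition that most improves |Pearson r|, stopping when nothing improves. The function name calf_greedy, the max_terms cap, and the use of SciPy's pearsonr are illustrative assumptions, not the authors' published implementation.

```python
import numpy as np
from scipy import stats


def calf_greedy(X, y, max_terms=10):
    """Greedy forward selection with +/-1 weights (illustrative sketch only).

    X : (n_samples, n_predictors) array of z-scored (normalized) predictors.
    y : real-valued response; |Pearson r| is the metric optimized here.
    Returns {column index: weight}, with each weight being +1.0 or -1.0.
    """
    n, p = X.shape
    weights = {}                 # predictors chosen so far and their weights
    score = np.zeros(n)          # current weighted sum of chosen predictors
    best_metric = 0.0            # best |Pearson r| achieved so far

    for _ in range(max_terms):
        best = None
        for j in range(p):
            if j in weights:     # each predictor enters the model at most once
                continue
            for w in (1.0, -1.0):
                r, _ = stats.pearsonr(score + w * X[:, j], y)
                if abs(r) > best_metric:
                    best_metric, best = abs(r), (j, w)
        if best is None:         # no single +/-1 addition improves the metric
            break
        j, w = best
        weights[j] = w
        score += w * X[:, j]
    return weights
```

For a binary response the Pearson metric would be swapped for a Welch t-test p-value or AUC, as the abstract notes. The early-stopping behavior also suggests why collinear predictors rarely both enter the model: once one is chosen, adding a near-copy of it no longer improves the metric.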

Bibliographic Details
Main Authors: Jeffries, Clark D., Ford, John R., Tilson, Jeffrey L., Perkins, Diana O., Bost, Darius M., Filer, Dayne L., Wilhelmsen, Kirk C.
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 31 March 2022 (Sci Rep)
Subjects: Article
Collection: PubMed (PMC8971398)
License: © The Author(s) 2022; Creative Commons Attribution 4.0 International (CC BY 4.0)
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8971398/
https://www.ncbi.nlm.nih.gov/pubmed/35361850
http://dx.doi.org/10.1038/s41598-022-09415-2