Regression shrinkage and selection via least quantile shrinkage and selection operator
Over recent years, the state-of-the-art lasso and adaptive lasso have acquired considerable attention. Unlike the lasso technique, the adaptive lasso incorporates the variables’ effects into the penalty by specifying adaptive weights that penalize coefficients differently. However, if the initial...
Main Authors: | , |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science 2023 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9934385/ https://www.ncbi.nlm.nih.gov/pubmed/36795659 http://dx.doi.org/10.1371/journal.pone.0266267 |
_version_ | 1784889873866424320 |
---|---|
author | Daneshvar, Alireza Mousa, Golalizadeh |
author_facet | Daneshvar, Alireza Mousa, Golalizadeh |
author_sort | Daneshvar, Alireza |
collection | PubMed |
description | Over recent years, the state-of-the-art lasso and adaptive lasso have acquired considerable attention. Unlike the lasso technique, the adaptive lasso incorporates the variables’ effects into the penalty by specifying adaptive weights that penalize each coefficient differently. However, if the initial values presumed for the coefficients are less than one, the corresponding weights become relatively large, leading to an increase in bias. To overcome this impediment, a new class of weighted lasso is introduced that exploits all aspects of the data. That is to say, the signs and magnitudes of the initial coefficients are taken into account simultaneously when proposing appropriate weights. To give the suggested penalty a particular form, the new method is named ‘lqsso’, standing for the least quantile shrinkage and selection operator. In this paper, we demonstrate that lqsso enjoys the oracle properties under certain mild conditions and present an efficient algorithm for its computation. Simulation studies show that the proposed methodology outperforms other lasso methods in various respects, particularly in ultra-high-dimensional settings. Application of the proposed method is further illustrated with a real-world problem based on the rat eye dataset. |
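The weight inflation the abstract describes can be seen directly from the classical adaptive-lasso weighting rule w_j = 1/|β̂_j|^γ (this is the standard adaptive lasso, not the paper's proposed lqsso penalty; the values below are illustrative):

```python
# Classical adaptive-lasso weights: w_j = 1 / |beta_init_j| ** gamma.
# When an initial coefficient estimate has magnitude below one, its weight
# exceeds one, so that coefficient is penalized more heavily -- the source
# of the extra bias the abstract points out.
gamma = 1.0                            # common choice for the weight exponent
beta_init = [2.0, 1.0, 0.5, 0.1]       # hypothetical initial estimates
weights = [1.0 / abs(b) ** gamma for b in beta_init]
print(weights)                         # small |beta| -> large penalty weight
```

Note that only the two estimates with magnitude below one receive weights above one; the proposed lqsso is motivated by using the signs as well as the magnitudes of these initial estimates when forming the weights.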
format | Online Article Text |
id | pubmed-9934385 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-99343852023-02-17 Regression shrinkage and selection via least quantile shrinkage and selection operator Daneshvar, Alireza Mousa, Golalizadeh PLoS One Research Article Over recent years, the state-of-the-art lasso and adaptive lasso have acquired considerable attention. Unlike the lasso technique, the adaptive lasso incorporates the variables’ effects into the penalty by specifying adaptive weights that penalize each coefficient differently. However, if the initial values presumed for the coefficients are less than one, the corresponding weights become relatively large, leading to an increase in bias. To overcome this impediment, a new class of weighted lasso is introduced that exploits all aspects of the data. That is to say, the signs and magnitudes of the initial coefficients are taken into account simultaneously when proposing appropriate weights. To give the suggested penalty a particular form, the new method is named ‘lqsso’, standing for the least quantile shrinkage and selection operator. In this paper, we demonstrate that lqsso enjoys the oracle properties under certain mild conditions and present an efficient algorithm for its computation. Simulation studies show that the proposed methodology outperforms other lasso methods in various respects, particularly in ultra-high-dimensional settings. Application of the proposed method is further illustrated with a real-world problem based on the rat eye dataset.
Public Library of Science 2023-02-16 /pmc/articles/PMC9934385/ /pubmed/36795659 http://dx.doi.org/10.1371/journal.pone.0266267 Text en © 2023 Daneshvar, Golalizadeh https://creativecommons.org/licenses/by/4.0/This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/) , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Daneshvar, Alireza Mousa, Golalizadeh Regression shrinkage and selection via least quantile shrinkage and selection operator |
title | Regression shrinkage and selection via least quantile shrinkage and selection operator |
title_full | Regression shrinkage and selection via least quantile shrinkage and selection operator |
title_fullStr | Regression shrinkage and selection via least quantile shrinkage and selection operator |
title_full_unstemmed | Regression shrinkage and selection via least quantile shrinkage and selection operator |
title_short | Regression shrinkage and selection via least quantile shrinkage and selection operator |
title_sort | regression shrinkage and selection via least quantile shrinkage and selection operator |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9934385/ https://www.ncbi.nlm.nih.gov/pubmed/36795659 http://dx.doi.org/10.1371/journal.pone.0266267 |
work_keys_str_mv | AT daneshvaralireza regressionshrinkageandselectionvialeastquantileshrinkageandselectionoperator AT mousagolalizadeh regressionshrinkageandselectionvialeastquantileshrinkageandselectionoperator |