Unbiased K-L estimator for the linear regression model


Bibliographic Details

Main Authors: Aladeitan, BENEDICTA, Lukman, Adewale F, Davids, Esther, Oranye, Ebele H, Kibria, Golam B M
Format: Online Article Text
Language: English
Published: F1000 Research Limited 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8825663/
https://www.ncbi.nlm.nih.gov/pubmed/35186270
http://dx.doi.org/10.12688/f1000research.54990.1
_version_ 1784647270610763776
author Aladeitan, BENEDICTA
Lukman, Adewale F
Davids, Esther
Oranye, Ebele H
Kibria, Golam B M
author_facet Aladeitan, BENEDICTA
Lukman, Adewale F
Davids, Esther
Oranye, Ebele H
Kibria, Golam B M
author_sort Aladeitan, BENEDICTA
collection PubMed
description Background: In the linear regression model, the performance of the ordinary least squares (OLS) estimator drops when multicollinearity is present. By the Gauss-Markov theorem, the estimator remains unbiased under multicollinearity, but the variances of its regression estimates become inflated. Estimators such as the ridge regression estimator and the K-L estimator were adopted as substitutes for the OLS estimator to overcome the problem of multicollinearity in the linear regression model. However, these estimators are biased, although they possess a smaller mean squared error than the OLS estimator. Methods: In this study, we developed a new unbiased estimator based on the K-L estimator and compared its performance with some existing estimators theoretically, through simulation, and with real-life data. Results: Theoretically, the new estimator, though unbiased, also possesses minimum variance when compared with the other estimators. Results from the simulation and the real-life study showed that the new estimator produced a smaller mean squared error (MSE) and had the smallest mean squared prediction error (MSPE). This further strengthened the findings of the theoretical comparison using both the MSE and the MSPE as criteria. Conclusions: A simulation study and a real-life application modelling the high heating values from proximate analysis were conducted to support the theoretical findings. This new method of estimation is recommended for parameter estimation, with or without multicollinearity, in the linear regression model.
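The abstract contrasts OLS with biased shrinkage estimators (ridge and K-L) under multicollinearity. As a minimal sketch of that contrast, assuming the Kibria-Lukman form b_KL = (X'X + kI)^-1 (X'X - kI) b_OLS, an arbitrary shrinkage constant k, and a McDonald-Galarneau-style simulated design (the paper's new unbiased estimator itself is not reproduced here):

```python
import numpy as np

def simulate_X(n, p, rho, rng):
    """Collinear design: pairwise correlation between columns is about rho**2."""
    z = rng.normal(size=(n, p + 1))
    return np.sqrt(1 - rho**2) * z[:, :p] + rho * z[:, [p]]

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def kl(X, y, k):
    """Assumed K-L (Kibria-Lukman) form: (X'X + kI)^-1 (X'X - kI) b_ols."""
    p = X.shape[1]
    S = X.T @ X
    return np.linalg.solve(S + k * np.eye(p), (S - k * np.eye(p)) @ ols(X, y))

rng = np.random.default_rng(42)
beta = np.array([1.0, 1.0, 1.0])          # true coefficients (illustrative)
n, p, rho, k, sigma, reps = 50, 3, 0.99, 0.5, 1.0, 500

# Monte Carlo estimate of MSE(b) = E ||b - beta||^2 for each estimator.
mse = {"ols": 0.0, "ridge": 0.0, "kl": 0.0}
for _ in range(reps):
    X = simulate_X(n, p, rho, rng)
    y = X @ beta + sigma * rng.normal(size=n)
    for name, est in (("ols", ols(X, y)),
                      ("ridge", ridge(X, y, k)),
                      ("kl", kl(X, y, k))):
        mse[name] += np.sum((est - beta) ** 2) / reps

print({name: round(v, 3) for name, v in mse.items()})
```

With rho = 0.99 the design is strongly collinear, so the OLS variance (and hence its MSE) is inflated, while the shrinkage estimators trade a little bias for a much smaller MSE, which is the motivation the abstract describes.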
format Online
Article
Text
id pubmed-8825663
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher F1000 Research Limited
record_format MEDLINE/PubMed
spelling pubmed-8825663 2022-02-17 Unbiased K-L estimator for the linear regression model Aladeitan, BENEDICTA Lukman, Adewale F Davids, Esther Oranye, Ebele H Kibria, Golam B M F1000Res Research Article Background: In the linear regression model, the performance of the ordinary least squares (OLS) estimator drops when multicollinearity is present. By the Gauss-Markov theorem, the estimator remains unbiased under multicollinearity, but the variances of its regression estimates become inflated. Estimators such as the ridge regression estimator and the K-L estimator were adopted as substitutes for the OLS estimator to overcome the problem of multicollinearity in the linear regression model. However, these estimators are biased, although they possess a smaller mean squared error than the OLS estimator. Methods: In this study, we developed a new unbiased estimator based on the K-L estimator and compared its performance with some existing estimators theoretically, through simulation, and with real-life data. Results: Theoretically, the new estimator, though unbiased, also possesses minimum variance when compared with the other estimators. Results from the simulation and the real-life study showed that the new estimator produced a smaller mean squared error (MSE) and had the smallest mean squared prediction error (MSPE). This further strengthened the findings of the theoretical comparison using both the MSE and the MSPE as criteria. Conclusions: A simulation study and a real-life application modelling the high heating values from proximate analysis were conducted to support the theoretical findings. This new method of estimation is recommended for parameter estimation, with or without multicollinearity, in the linear regression model. F1000 Research Limited 2021-08-19 /pmc/articles/PMC8825663/ /pubmed/35186270 http://dx.doi.org/10.12688/f1000research.54990.1 Text en Copyright: © 2021 Aladeitan B et al.
https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution Licence, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Aladeitan, BENEDICTA
Lukman, Adewale F
Davids, Esther
Oranye, Ebele H
Kibria, Golam B M
Unbiased K-L estimator for the linear regression model
title Unbiased K-L estimator for the linear regression model
title_full Unbiased K-L estimator for the linear regression model
title_fullStr Unbiased K-L estimator for the linear regression model
title_full_unstemmed Unbiased K-L estimator for the linear regression model
title_short Unbiased K-L estimator for the linear regression model
title_sort unbiased k-l estimator for the linear regression model
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8825663/
https://www.ncbi.nlm.nih.gov/pubmed/35186270
http://dx.doi.org/10.12688/f1000research.54990.1
work_keys_str_mv AT aladeitanbenedicta unbiasedklestimatorforthelinearregressionmodel
AT lukmanadewalef unbiasedklestimatorforthelinearregressionmodel
AT davidsesther unbiasedklestimatorforthelinearregressionmodel
AT oranyeebeleh unbiasedklestimatorforthelinearregressionmodel
AT kibriagolambm unbiasedklestimatorforthelinearregressionmodel