
Gradient boosting decision tree becomes more reliable than logistic regression in predicting probability for diabetes with big data

Bibliographic Details
Main Authors: Seto, Hiroe, Oyama, Asuka, Kitora, Shuji, Toki, Hiroshi, Yamamoto, Ryohei, Kotoku, Jun’ichi, Haga, Akihiro, Shinzawa, Maki, Yamakawa, Miyae, Fukui, Sakiko, Moriyama, Toshiki
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9553945/
https://www.ncbi.nlm.nih.gov/pubmed/36220875
http://dx.doi.org/10.1038/s41598-022-20149-z
_version_ 1784806587621179392
author Seto, Hiroe
Oyama, Asuka
Kitora, Shuji
Toki, Hiroshi
Yamamoto, Ryohei
Kotoku, Jun’ichi
Haga, Akihiro
Shinzawa, Maki
Yamakawa, Miyae
Fukui, Sakiko
Moriyama, Toshiki
author_sort Seto, Hiroe
collection PubMed
description We sought to verify the reliability of machine learning (ML) in developing diabetes prediction models by utilizing big data. To this end, we compared the reliability of gradient boosting decision tree (GBDT) and logistic regression (LR) models using data obtained from the Kokuho-database of the Osaka prefecture, Japan. To develop the models, we focused on 16 predictors from health checkup data from April 2013 to December 2014. A total of 277,651 eligible participants were studied. The prediction models were developed using a light gradient boosting machine (LightGBM), which is an effective GBDT implementation, and LR. Their reliabilities were measured based on expected calibration error (ECE), negative log-likelihood (Logloss), and reliability diagrams. Similarly, their classification accuracies were measured by the area under the curve (AUC). We further analyzed their reliabilities while changing the sample size for training. Among the 277,651 participants, 15,900 (7978 males and 7922 females) were newly diagnosed with diabetes within 3 years. LightGBM (LR) achieved an ECE of 0.0018 ± 0.00033 (0.0048 ± 0.00058), a Logloss of 0.167 ± 0.00062 (0.172 ± 0.00090), and an AUC of 0.844 ± 0.0025 (0.826 ± 0.0035). From the sample size analysis, the reliability of LightGBM became higher than that of LR when the training sample size exceeded [Formula: see text]. Thus, we confirmed that GBDT provides a more reliable model than LR in the development of diabetes prediction models using big data. ML could potentially produce a highly reliable diabetes prediction model, a helpful tool for improving lifestyle and preventing diabetes.
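The evaluation described in the abstract can be sketched in code. The following Python snippet is not the authors' implementation: it uses synthetic data in place of the Kokuho-database checkup records, and the ECE binning scheme, model hyperparameters, class imbalance, and sample sizes are illustrative assumptions chosen only to show how ECE, Logloss, and AUC can be computed for LightGBM and LR probability estimates.

import numpy as np
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss, roc_auc_score
from sklearn.model_selection import train_test_split

def expected_calibration_error(y_true, y_prob, n_bins=10):
    # Equal-width probability bins; ECE is the bin-size-weighted average of
    # |observed event rate - mean predicted probability| over the bins.
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (y_prob > lo) & (y_prob <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(y_true[in_bin].mean() - y_prob[in_bin].mean())
    return ece

# Synthetic stand-in: 16 predictors and roughly a 7% positive rate, loosely
# mirroring the 16 checkup features and the incidence reported in the abstract.
X, y = make_classification(n_samples=50_000, n_features=16,
                           weights=[0.93], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "LightGBM": LGBMClassifier(n_estimators=300, learning_rate=0.05, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(f"{name}: ECE={expected_calibration_error(y_te, p):.4f}  "
          f"Logloss={log_loss(y_te, p):.4f}  AUC={roc_auc_score(y_te, p):.4f}")

In the study's terms, a better-calibrated model shows a lower ECE and Logloss at a comparable AUC; the sketch only reproduces the shape of that comparison, not the reported numbers.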
format Online
Article
Text
id pubmed-9553945
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-9553945 2022-10-13 Sci Rep Article Nature Publishing Group UK 2022-10-11 /pmc/articles/PMC9553945/ /pubmed/36220875 http://dx.doi.org/10.1038/s41598-022-20149-z Text en © The Author(s) 2022, corrected publication 2022. Open Access: licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title Gradient boosting decision tree becomes more reliable than logistic regression in predicting probability for diabetes with big data
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9553945/
https://www.ncbi.nlm.nih.gov/pubmed/36220875
http://dx.doi.org/10.1038/s41598-022-20149-z