Regularized impurity reduction: accurate decision trees with complexity guarantees
Main Authors: | Zhang, Guangyi; Gionis, Aristides |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Springer US, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9813065/ https://www.ncbi.nlm.nih.gov/pubmed/36618773 http://dx.doi.org/10.1007/s10618-022-00884-7 |
_version_ | 1784863851358978048 |
author | Zhang, Guangyi Gionis, Aristides |
author_facet | Zhang, Guangyi Gionis, Aristides |
author_sort | Zhang, Guangyi |
collection | PubMed |
description | Decision trees are popular classification models, providing high accuracy and intuitive explanations. However, as the tree size grows the model interpretability deteriorates. Traditional tree-induction algorithms, such as C4.5 and CART, rely on impurity-reduction functions that promote the discriminative power of each split. Thus, although these traditional methods are accurate in practice, there has been no theoretical guarantee that they will produce small trees. In this paper, we justify the use of a general family of impurity functions, including the popular functions of entropy and Gini-index, in scenarios where small trees are desirable, by showing that a simple enhancement can equip them with complexity guarantees. We consider a general setting, where objects to be classified are drawn from an arbitrary probability distribution, classification can be binary or multi-class, and splitting tests are associated with non-uniform costs. As a measure of tree complexity, we adopt the expected cost to classify an object drawn from the input distribution, which, in the uniform-cost case, is the expected number of tests. We propose a tree-induction algorithm that gives a logarithmic approximation guarantee on the tree complexity. This approximation factor is tight up to a constant factor under mild assumptions. The algorithm recursively selects a test that maximizes a greedy criterion defined as a weighted sum of three components. The first two components encourage the selection of tests that improve the balance and the cost-efficiency of the tree, respectively, while the third impurity-reduction component encourages the selection of more discriminative tests. As shown in our empirical evaluation, compared to the original heuristics, the enhanced algorithms strike an excellent balance between predictive accuracy and tree complexity. |
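The abstract describes a greedy criterion that is a weighted sum of three components: a balance term, a cost-efficiency term, and an impurity-reduction term. The sketch below is an illustrative reconstruction of that idea, not the authors' implementation: the function names, the weights `alpha`, `beta`, `gamma`, and the exact form of the balance and cost terms are assumptions; only the entropy-based impurity reduction follows the standard definition.

```python
import math

def entropy(class_counts):
    """Shannon entropy of a label distribution (one of the
    impurity functions covered by the paper's family)."""
    total = sum(class_counts)
    if total == 0:
        return 0.0
    return -sum((c / total) * math.log2(c / total)
                for c in class_counts if c > 0)

def greedy_score(left_counts, right_counts, test_cost,
                 alpha=1.0, beta=1.0, gamma=1.0):
    """Illustrative weighted-sum criterion with three components:
    balance, cost-efficiency, and impurity reduction.
    The weights and term shapes here are hypothetical."""
    n_left, n_right = sum(left_counts), sum(right_counts)
    n = n_left + n_right
    parent = [l + r for l, r in zip(left_counts, right_counts)]
    # Balance term: favors splits sending comparable mass to both sides.
    balance = min(n_left, n_right) / n
    # Cost-efficiency term: favors cheap tests
    # (constant when all test costs are uniform).
    cost_eff = 1.0 / test_cost
    # Impurity reduction: classic weighted decrease in entropy.
    reduction = (entropy(parent)
                 - (n_left / n) * entropy(left_counts)
                 - (n_right / n) * entropy(right_counts))
    return alpha * balance + beta * cost_eff + gamma * reduction
```

Under this toy scoring, a perfectly balanced, unit-cost split that cleanly separates two pure classes maximizes all three terms at once, which matches the intuition in the abstract: balance and cost-efficiency control tree complexity, while impurity reduction preserves discriminative power.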
format | Online Article Text |
id | pubmed-9813065 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Springer US |
record_format | MEDLINE/PubMed |
spelling | pubmed-98130652023-01-06 Regularized impurity reduction: accurate decision trees with complexity guarantees Zhang, Guangyi Gionis, Aristides Data Min Knowl Discov Article [abstract as in the description field above] Springer US 2022-11-28 2023 /pmc/articles/PMC9813065/ /pubmed/36618773 http://dx.doi.org/10.1007/s10618-022-00884-7 Text en © The Author(s) 2022. Open Access under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Zhang, Guangyi Gionis, Aristides Regularized impurity reduction: accurate decision trees with complexity guarantees |
title | Regularized impurity reduction: accurate decision trees with complexity guarantees |
title_full | Regularized impurity reduction: accurate decision trees with complexity guarantees |
title_fullStr | Regularized impurity reduction: accurate decision trees with complexity guarantees |
title_full_unstemmed | Regularized impurity reduction: accurate decision trees with complexity guarantees |
title_short | Regularized impurity reduction: accurate decision trees with complexity guarantees |
title_sort | regularized impurity reduction: accurate decision trees with complexity guarantees |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9813065/ https://www.ncbi.nlm.nih.gov/pubmed/36618773 http://dx.doi.org/10.1007/s10618-022-00884-7 |
work_keys_str_mv | AT zhangguangyi regularizedimpurityreductionaccuratedecisiontreeswithcomplexityguarantees AT gionisaristides regularizedimpurityreductionaccuratedecisiontreeswithcomplexityguarantees |