Precision oncology: a review to assess interpretability in several explainable methods

Bibliographic Details
Main Authors: Gimeno, Marian; Sada del Real, Katyna; Rubio, Angel
Format: Online Article Text
Language: English
Published: Oxford University Press 2023
Subjects: Review
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10359088/
https://www.ncbi.nlm.nih.gov/pubmed/37253690
http://dx.doi.org/10.1093/bib/bbad200
author Gimeno, Marian
Sada del Real, Katyna
Rubio, Angel
collection PubMed
description Great efforts have been made to develop precision medicine-based treatments using machine learning. In this field, where the goal is to provide the optimal treatment for each patient based on his/her medical history and genomic characteristics, it is not sufficient to make excellent predictions. The challenge is to understand and trust the model’s decisions while also being able to easily implement it. However, one of the issues with machine learning algorithms—particularly deep learning—is their lack of interpretability. This review compares six different machine learning methods to provide guidance for defining interpretability by focusing on accuracy, multi-omics capability, explainability and implementability. Our selection of algorithms includes tree-, regression- and kernel-based methods, which we selected for their ease of interpretation for the clinician. We also included two novel explainable methods in the comparison. No significant differences in accuracy were observed when comparing the methods, but an improvement was observed when using gene expression instead of mutational status as input for these methods. We concentrated on the current intriguing challenge: model comprehension and ease of use. Our comparison suggests that the tree-based methods are the most interpretable of those tested.
format Online
Article
Text
id pubmed-10359088
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Oxford University Press
record_format MEDLINE/PubMed
spelling pubmed-10359088 2023-07-21 Precision oncology: a review to assess interpretability in several explainable methods
Gimeno, Marian; Sada del Real, Katyna; Rubio, Angel
Brief Bioinform, Review
Oxford University Press 2023-05-30
/pmc/articles/PMC10359088/ /pubmed/37253690 http://dx.doi.org/10.1093/bib/bbad200
Text en © The Author(s) 2023. Published by Oxford University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.
title Precision oncology: a review to assess interpretability in several explainable methods
topic Review
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10359088/
https://www.ncbi.nlm.nih.gov/pubmed/37253690
http://dx.doi.org/10.1093/bib/bbad200