Explaining a series of models by propagating Shapley values
Local feature attribution methods are increasingly used to explain complex machine learning models. However, current methods are limited because they are extremely expensive to compute or are not capable of explaining a distributed series of models where each model is owned by a separate institution...
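The abstract notes that exact local attribution is "extremely expensive to compute." As a toy illustration of why (not the paper's propagation method), the sketch below computes exact Shapley values by enumerating all feature subsets, which costs O(2^n) model evaluations; the `value_fn` game and the additive example are assumptions for demonstration only.

```python
from itertools import combinations
from math import factorial

def exact_shapley(value_fn, n_features):
    """Exact Shapley values by brute-force subset enumeration (O(2^n))."""
    features = list(range(n_features))
    phi = [0.0] * n_features
    for i in features:
        others = [f for f in features if f != i]
        for size in range(len(others) + 1):
            for subset in combinations(others, size):
                s = len(subset)
                # Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(s) * factorial(n_features - s - 1) / factorial(n_features)
                # Marginal contribution of feature i to coalition S
                marginal = value_fn(set(subset) | {i}) - value_fn(set(subset))
                phi[i] += weight * marginal
    return phi

# Hypothetical additive game: each feature contributes its own weight,
# so the Shapley values recover the weights (up to floating point).
weights = [1.0, 2.0, 3.0]
v = lambda S: sum(weights[j] for j in S)
print(exact_shapley(v, 3))
```

For an additive game the Shapley values equal the per-feature weights; the exponential subset loop is exactly the cost that approximation and propagation methods aim to avoid.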
Main Authors: Chen, Hugh; Lundberg, Scott M.; Lee, Su-In
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9349278/
https://www.ncbi.nlm.nih.gov/pubmed/35922410
http://dx.doi.org/10.1038/s41467-022-31384-3
Similar Items

- The Shapley value: essays in honor of Lloyd S. Shapley
  by: Roth, Alvin E
  Published: (1988)
- Explaining multivariate molecular diagnostic tests via Shapley values
  by: Roder, Joanna, et al.
  Published: (2021)
- Explaining Multiclass Compound Activity Predictions Using Counterfactuals and Shapley Values
  by: Lamens, Alec, et al.
  Published: (2023)
- The Shapley value of regression portfolios
  by: Shalit, Haim
  Published: (2020)
- Calculation of exact Shapley values for explaining support vector machine models using the radial basis function kernel
  by: Mastropietro, Andrea, et al.
  Published: (2023)