Distilling experience into a physically interpretable recommender system for computational model selection
Model selection is a chronic issue in computational science. The conventional approach relies heavily on human experience. However, gaining experience takes years and is severely inefficient. To address this issue, we distill human experience into a recommender system. A trained recommender system t...
Main authors: Huang, Xinyi; Chyczewski, Thomas; Xia, Zhenhua; Kunz, Robert; Yang, Xiang
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Online access:
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9908871/
- https://www.ncbi.nlm.nih.gov/pubmed/36755115
- http://dx.doi.org/10.1038/s41598-023-27426-5
Similar Items
- TNT: An Interpretable Tree-Network-Tree Learning Framework using Knowledge Distillation
  by: Li, Jiawei, et al.
  Published: (2020)
- Interpretable pairwise distillations for generative protein sequence models
  by: Feinauer, Christoph, et al.
  Published: (2022)
- Distilling identifiable and interpretable dynamic models from biological data
  by: Massonis, Gemma, et al.
  Published: (2023)
- Knowledge distillation for multi-depth-model-fusion recommendation algorithm
  by: Yang, Mingbao, et al.
  Published: (2022)
- Feature Selection for Recommender Systems with Quantum Computing
  by: Nembrini, Riccardo, et al.
  Published: (2021)