Transformer-CNN: Swiss knife for QSAR modeling and interpretation
We present SMILES-embeddings derived from the internal encoder state of a Transformer [1] model trained to canonize SMILES as a Seq2Seq problem. Using a CharNN [2] architecture upon the embeddings results in higher quality interpretable QSAR/QSPR models on diverse benchmark datasets including regres...
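The abstract describes a pipeline that treats SMILES strings as character sequences: a Transformer is trained to canonize SMILES as a Seq2Seq task, and a CharNN consumes the resulting embeddings. As a minimal illustration of the character-level preprocessing such a pipeline relies on, the sketch below builds a vocabulary from a small SMILES corpus and encodes each string as a fixed-length sequence of integer token ids. All names (`build_vocab`, `encode`, the reserved `PAD`/`UNK` ids, the example molecules) are hypothetical and not taken from the paper's code.

```python
# Hypothetical character-level SMILES preprocessing sketch (not the paper's code):
# map every character to an integer id, then pad each sequence to a fixed length.

PAD, UNK = 0, 1  # reserved ids for padding and out-of-vocabulary characters

def build_vocab(smiles_list):
    """Assign an integer id (>= 2) to every character seen in the corpus."""
    chars = sorted({c for s in smiles_list for c in s})
    return {c: i + 2 for i, c in enumerate(chars)}

def encode(smiles, vocab, max_len=20):
    """Encode one SMILES as a fixed-length list of ids, right-padded with PAD."""
    ids = [vocab.get(c, UNK) for c in smiles[:max_len]]
    return ids + [PAD] * (max_len - len(ids))

smiles = ["CCO", "c1ccccc1", "CC(=O)O"]  # ethanol, benzene, acetic acid
vocab = build_vocab(smiles)
encoded = [encode(s, vocab) for s in smiles]
print(len(vocab), encoded[0][:4])  # → 7 [6, 6, 7, 0]
```

In the actual model these integer sequences would feed an embedding layer of the Transformer encoder; the sketch only shows the tokenization step.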
Main authors: Karpov, Pavel; Godin, Guillaume; Tetko, Igor V.
Format: Online Article Text
Language: English
Published: Springer International Publishing, 2020
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7079452/
https://www.ncbi.nlm.nih.gov/pubmed/33431004
http://dx.doi.org/10.1186/s13321-020-00423-w
Similar items
- State-of-the-art augmented NLP transformer models for direct and single-step retrosynthesis
  by: Tetko, Igor V., et al.
  Published: (2020)
- Benchmarks for interpretation of QSAR models
  by: Matveieva, Mariia, et al.
  Published: (2021)
- Prediction-driven matched molecular pairs to interpret QSARs and aid the molecular optimization process
  by: Sushko, Yurii, et al.
  Published: (2014)
- TURBO: The Swiss Knife of Auto-Encoders
  by: Quétant, Guillaume, et al.
  Published: (2023)
- QSAR modeling for In vitro assays: linking ToxCast™ database to the integrated modeling framework “OCHEM”
  by: Abdelaziz, Ahmed, et al.
  Published: (2012)