State-of-the-art augmented NLP transformer models for direct and single-step retrosynthesis
We investigated the effect of different training scenarios on predicting the (retro)synthesis of chemical compounds using text-like representation of chemical reactions (SMILES) and Natural Language Processing (NLP) neural network Transformer architecture. We showed that data augmentation, which is...
Main Authors: Tetko, Igor V.; Karpov, Pavel; Van Deursen, Ruud; Godin, Guillaume
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2020
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7643129/
https://www.ncbi.nlm.nih.gov/pubmed/33149154
http://dx.doi.org/10.1038/s41467-020-19266-y
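The abstract credits SMILES data augmentation for the reported accuracy gains of the retrosynthesis Transformer. As a minimal sketch of the general idea (not the authors' exact pipeline), the Python snippet below enumerates randomized, non-canonical SMILES for a molecule with RDKit; pairing such augmented reactant/product strings is how text-style augmentation is typically applied to reaction models. The function and variable names are illustrative assumptions.

```python
# Hedged sketch: enumerate randomized SMILES strings for data augmentation.
# Assumes RDKit is installed; names are illustrative, not taken from the paper's code.
from rdkit import Chem


def augment_smiles(smiles: str, n_variants: int = 5) -> list[str]:
    """Return up to n_variants randomized (non-canonical) SMILES for one molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return []
    variants: set[str] = set()
    attempts = 0
    # Bound the attempts so small molecules with few distinct renderings cannot loop forever.
    while len(variants) < n_variants and attempts < 10 * n_variants:
        attempts += 1
        # doRandom=True starts the SMILES traversal from a random atom,
        # yielding an alternative but chemically equivalent string.
        variants.add(Chem.MolToSmiles(mol, canonical=False, doRandom=True))
    return sorted(variants)


if __name__ == "__main__":
    # Aspirin as a small example input.
    for s in augment_smiles("CC(=O)Oc1ccccc1C(=O)O"):
        print(s)
```

Each randomized string maps back to the same molecule, so the augmented source/target pairs enlarge the training set without changing the underlying chemistry.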
Similar Items
- Transformer-CNN: Swiss knife for QSAR modeling and interpretation
  by: Karpov, Pavel, et al.
  Published: (2020)
- GEN: highly efficient SMILES explorer using autodidactic generative examination networks
  by: van Deursen, Ruud, et al.
  Published: (2020)
- Single-step retrosynthesis prediction by leveraging commonly preserved substructures
  by: Fang, Lei, et al.
  Published: (2023)
- Enhancing diversity in language based models for single-step retrosynthesis
  by: Toniato, Alessandra, et al.
  Published: (2023)
- Improving the performance of models for one-step retrosynthesis through re-ranking
  by: Lin, Min Htoo, et al.
  Published: (2022)