Memory augmented recurrent neural networks for de-novo drug design
A recurrent neural network (RNN) is a machine learning model that learns the relationship between elements of an input series, in addition to inferring a relationship between the data input to the model and target output. Memory augmentation allows the RNN to learn the interrelationships between elements of the input over a protracted length of the input series.
Main Authors: | Suresh, Naveen; Chinnakonda Ashok Kumar, Neelesh; Subramanian, Srikumar; Srinivasa, Gowri |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2022 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9223405/ https://www.ncbi.nlm.nih.gov/pubmed/35737661 http://dx.doi.org/10.1371/journal.pone.0269461 |
_version_ | 1784733118043783168 |
---|---|
author | Suresh, Naveen Chinnakonda Ashok Kumar, Neelesh Subramanian, Srikumar Srinivasa, Gowri |
author_facet | Suresh, Naveen Chinnakonda Ashok Kumar, Neelesh Subramanian, Srikumar Srinivasa, Gowri |
author_sort | Suresh, Naveen |
collection | PubMed |
description | A recurrent neural network (RNN) is a machine learning model that learns the relationship between elements of an input series, in addition to inferring a relationship between the data input to the model and target output. Memory augmentation allows the RNN to learn the interrelationships between elements of the input over a protracted length of the input series. Inspired by the success of stack augmented RNN (StackRNN) to generate strings for various applications, we present two memory augmented RNN-based architectures: the Neural Turing Machine (NTM) and the Differentiable Neural Computer (DNC) for the de-novo generation of small molecules. We trained a character-level convolutional neural network (CNN) to predict the properties of a generated string and compute a reward or loss in a deep reinforcement learning setup to bias the Generator to produce molecules with the desired property. Further, we compare the performance of these architectures to gain insight into their relative merits in terms of the validity and novelty of the generated molecules and the degree of property bias towards the computational generation of de-novo drugs. We also compare the performance of these architectures with simpler recurrent neural networks (Vanilla RNN, LSTM, and GRU) without an external memory component to explore the impact of augmented memory in the task of de-novo generation of small molecules. |
format | Online Article Text |
id | pubmed-9223405 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-92234052022-06-24 Memory augmented recurrent neural networks for de-novo drug design Suresh, Naveen Chinnakonda Ashok Kumar, Neelesh Subramanian, Srikumar Srinivasa, Gowri PLoS One Research Article A recurrent neural network (RNN) is a machine learning model that learns the relationship between elements of an input series, in addition to inferring a relationship between the data input to the model and target output. Memory augmentation allows the RNN to learn the interrelationships between elements of the input over a protracted length of the input series. Inspired by the success of stack augmented RNN (StackRNN) to generate strings for various applications, we present two memory augmented RNN-based architectures: the Neural Turing Machine (NTM) and the Differentiable Neural Computer (DNC) for the de-novo generation of small molecules. We trained a character-level convolutional neural network (CNN) to predict the properties of a generated string and compute a reward or loss in a deep reinforcement learning setup to bias the Generator to produce molecules with the desired property. Further, we compare the performance of these architectures to gain insight into their relative merits in terms of the validity and novelty of the generated molecules and the degree of property bias towards the computational generation of de-novo drugs. We also compare the performance of these architectures with simpler recurrent neural networks (Vanilla RNN, LSTM, and GRU) without an external memory component to explore the impact of augmented memory in the task of de-novo generation of small molecules. Public Library of Science 2022-06-23 /pmc/articles/PMC9223405/ /pubmed/35737661 http://dx.doi.org/10.1371/journal.pone.0269461 Text en © 2022 Suresh et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Suresh, Naveen Chinnakonda Ashok Kumar, Neelesh Subramanian, Srikumar Srinivasa, Gowri Memory augmented recurrent neural networks for de-novo drug design |
title | Memory augmented recurrent neural networks for de-novo drug design |
title_full | Memory augmented recurrent neural networks for de-novo drug design |
title_fullStr | Memory augmented recurrent neural networks for de-novo drug design |
title_full_unstemmed | Memory augmented recurrent neural networks for de-novo drug design |
title_short | Memory augmented recurrent neural networks for de-novo drug design |
title_sort | memory augmented recurrent neural networks for de-novo drug design |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9223405/ https://www.ncbi.nlm.nih.gov/pubmed/35737661 http://dx.doi.org/10.1371/journal.pone.0269461 |
work_keys_str_mv | AT sureshnaveen memoryaugmentedrecurrentneuralnetworksfordenovodrugdesign AT chinnakondaashokkumarneelesh memoryaugmentedrecurrentneuralnetworksfordenovodrugdesign AT subramaniansrikumar memoryaugmentedrecurrentneuralnetworksfordenovodrugdesign AT srinivasagowri memoryaugmentedrecurrentneuralnetworksfordenovodrugdesign |
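The description above outlines a reinforcement-learning setup in which a generator RNN (Vanilla RNN, LSTM, GRU, StackRNN, NTM, or DNC) produces SMILES strings character by character and a separately trained character-level CNN scores each string, supplying a reward that biases the generator toward a desired property. The following is a minimal, hypothetical PyTorch sketch of that loop, not the authors' implementation; the class names (`Generator`, `PropertyCNN`), the helper `reinforce_step`, the toy vocabulary, and all hyperparameters are placeholders chosen for illustration.

```python
# Hypothetical sketch of the property-biasing RL loop described in the abstract:
# a generator RNN samples a SMILES string, a character-level CNN predicts the
# target property as a reward, and a REINFORCE-style policy-gradient update
# nudges the generator toward higher-reward molecules.
import torch
import torch.nn as nn

# Toy character vocabulary; a real SMILES vocabulary would be larger.
VOCAB = list("CNOSPFBrcl()[]=#1234567890") + ["<bos>", "<eos>"]
STOI = {ch: i for i, ch in enumerate(VOCAB)}


class Generator(nn.Module):
    """Character-level generator; the paper swaps this recurrent core for an NTM/DNC."""
    def __init__(self, vocab_size, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, tokens, state=None):
        h, state = self.rnn(self.embed(tokens), state)
        return self.head(h), state


class PropertyCNN(nn.Module):
    """Character-level CNN that predicts the property used as the RL reward."""
    def __init__(self, vocab_size, channels=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, channels)
        self.conv = nn.Conv1d(channels, channels, kernel_size=5, padding=2)
        self.out = nn.Linear(channels, 1)

    def forward(self, tokens):
        x = self.embed(tokens).transpose(1, 2)       # (batch, channels, length)
        x = torch.relu(self.conv(x)).mean(dim=2)     # global average pooling
        return self.out(x).squeeze(-1)               # predicted property value


def reinforce_step(gen, predictor, optimizer, max_len=80):
    """One REINFORCE update: sample a string, score it, reweight its log-likelihood."""
    token = torch.tensor([[STOI["<bos>"]]])
    state, log_probs, sampled = None, [], []
    for _ in range(max_len):
        logits, state = gen(token, state)
        dist = torch.distributions.Categorical(logits=logits[:, -1])
        token = dist.sample().unsqueeze(1)           # next input character
        log_probs.append(dist.log_prob(token.squeeze(1)))
        sampled.append(token)
        if token.item() == STOI["<eos>"]:
            break
    seq = torch.cat(sampled, dim=1)                  # sampled character sequence
    with torch.no_grad():
        reward = predictor(seq).item()               # higher = more desirable
    loss = -reward * torch.stack(log_probs).sum()    # policy-gradient loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return reward
```

A single update might look like `reinforce_step(Generator(len(VOCAB)), PropertyCNN(len(VOCAB)), torch.optim.Adam(gen.parameters(), lr=1e-4))`. Per the abstract, the CNN predictor is trained beforehand to predict properties of generated strings; in practice the generator would also typically be pre-trained on a corpus of valid SMILES so that the reward biases, rather than replaces, its learned chemistry, though that detail is not spelled out in the record above.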