Graph–sequence attention and transformer for predicting drug–target affinity
Drug–target binding affinity (DTA) prediction has drawn increasing interest due to its central role in the drug discovery process. The development of new drugs is costly, time-consuming, and often accompanied by safety issues. Drug repurposing can avoid the expensive and lengthy process of d...
Main Authors: | Yan, Xiangfeng; Liu, Yong |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | The Royal Society of Chemistry, 2022 |
Subjects: | Chemistry |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9562047/ https://www.ncbi.nlm.nih.gov/pubmed/36320763 http://dx.doi.org/10.1039/d2ra05566j |
author | Yan, Xiangfeng; Liu, Yong |
---|---|
collection | PubMed |
description | Drug–target binding affinity (DTA) prediction has drawn increasing interest due to its central role in the drug discovery process. The development of new drugs is costly, time-consuming, and often accompanied by safety issues. Drug repurposing can avoid this expensive and lengthy process by finding new uses for already approved drugs. It is therefore of great significance to develop effective computational methods for predicting DTAs. Attention mechanisms allow a model to focus on the most relevant parts of its input and have proven useful across a variety of tasks. In this study, we propose a novel self-attention-based model, called GSATDTA, to predict the binding affinity between drugs and targets. To represent a drug, we use Bi-directional Gated Recurrent Units (BiGRU) to extract a sequence representation from its SMILES string and a graph neural network to extract a graph representation from its molecular graph; an attention mechanism then fuses the two drug representations. For the target protein, we use an efficient transformer to learn a representation that captures long-distance relationships in the sequence of amino acids. Extensive experiments comparing our model with state-of-the-art models show that it outperforms current state-of-the-art methods on two independent datasets. (An illustrative sketch of this architecture follows the record below.) |
format | Online Article Text |
id | pubmed-9562047 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | The Royal Society of Chemistry |
record_format | MEDLINE/PubMed |
spelling | pubmed-9562047 2022-10-31. RSC Adv, The Royal Society of Chemistry, published 2022-10-14. Text en. This journal is © The Royal Society of Chemistry https://creativecommons.org/licenses/by-nc/3.0/ |
title | Graph–sequence attention and transformer for predicting drug–target affinity |
topic | Chemistry |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9562047/ https://www.ncbi.nlm.nih.gov/pubmed/36320763 http://dx.doi.org/10.1039/d2ra05566j |
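To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch of a GSATDTA-style pipeline: a BiGRU over SMILES tokens, one mean-aggregation message-passing step over a dense molecular adjacency matrix, cross-attention that fuses the sequence and graph views of the drug, a transformer encoder over the amino-acid sequence, and an MLP head that regresses the affinity. The layer sizes, the dense-adjacency GNN, the cross-attention fusion details, and the vanilla transformer standing in for the paper's "efficient transformer" are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class DrugEncoder(nn.Module):
    """Two views of one drug: a BiGRU over SMILES tokens and a simple
    message-passing GNN over the molecular graph, fused with attention."""

    def __init__(self, vocab_size: int, atom_dim: int, hid: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hid)
        # Bi-directional GRU: hid//2 units per direction -> hid-dim outputs.
        self.bigru = nn.GRU(hid, hid // 2, batch_first=True, bidirectional=True)
        self.gnn = nn.Linear(atom_dim, hid)  # one mean-aggregation GNN step
        # Cross-attention: SMILES states (queries) attend over atom states.
        self.fuse = nn.MultiheadAttention(hid, num_heads=4, batch_first=True)

    def forward(self, smiles_ids, atom_feats, adj):
        # smiles_ids: (B, Ls); atom_feats: (B, La, atom_dim); adj: (B, La, La)
        seq, _ = self.bigru(self.embed(smiles_ids))             # (B, Ls, hid)
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)      # node degrees
        nodes = torch.relu(self.gnn((adj @ atom_feats) / deg))  # (B, La, hid)
        fused, _ = self.fuse(seq, nodes, nodes)                 # fuse both views
        return fused.mean(dim=1)                                # pooled drug vector


class ProteinEncoder(nn.Module):
    """Plain transformer encoder over the amino-acid sequence; a vanilla
    encoder stands in here for the paper's efficient transformer."""

    def __init__(self, n_amino: int = 26, hid: int = 128):
        super().__init__()
        self.embed = nn.Embedding(n_amino, hid)
        layer = nn.TransformerEncoderLayer(hid, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, protein_ids):                             # (B, Lp)
        return self.encoder(self.embed(protein_ids)).mean(dim=1)


# Toy usage: concatenate the two embeddings and regress a scalar affinity.
drug, prot = DrugEncoder(vocab_size=64, atom_dim=44), ProteinEncoder()
head = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 1))
s = torch.randint(0, 64, (2, 30))          # 2 SMILES strings, 30 tokens each
x = torch.rand(2, 20, 44)                  # 20 atoms, 44 features per atom
a = (torch.rand(2, 20, 20) > 0.7).float()  # random dense adjacency matrix
p = torch.randint(0, 26, (2, 100))         # 2 proteins, 100 residues each
affinity = head(torch.cat([drug(s, x, a), prot(p)], dim=-1))  # shape (2, 1)
```

In practice, tokenized SMILES and protein sequences, a molecular adjacency matrix built from the actual chemical structure, and an MSE training loop against measured affinities would replace the random toy tensors above.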