
Attention mechanism enhanced LSTM with residual architecture and its application for protein-protein interaction residue pairs prediction

BACKGROUND: Recurrent neural networks (RNNs) are well suited to processing sequential data, but they handle long sequences inefficiently. As a variant of the RNN, long short-term memory (LSTM) solves this problem to some extent. Here we improve LSTM for big-data applications in protein-protein interaction interface residue pair prediction, for two reasons. On the one hand, LSTM has some deficiencies, such as shallow layers and exploding or vanishing gradients; as data volumes increase dramatically, the imbalance between algorithmic innovation and big-data processing has become more serious and urgent. On the other hand, predicting protein-protein interaction interface residue pairs is an important problem in biology, and the low accuracy of existing predictions compels us to propose new computational methods.

RESULTS: To overcome the aforementioned problems of LSTM, we adopt a residual architecture and add an attention mechanism to LSTM. In detail, we redefine the block, add a connection from front to back in every two layers, and add an attention mechanism to strengthen the capability of mining information. We then use the model to predict protein-protein interaction interface residue pairs and achieve an accuracy of over 72%. Furthermore, we compare our method with random experiments, PPiPP, standard LSTM, and several other machine learning methods, and it shows better performance than all of them.

CONCLUSION: We present an attention-mechanism-enhanced LSTM with a residual architecture, which allows deeper networks without, to a certain extent, vanishing or exploding gradients. We then apply it to a significant problem, protein-protein interaction interface residue pair prediction, and obtain better accuracy than other methods. Our method provides a new approach for protein-protein interaction computation, which will be helpful for related biomedical research.
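To make the architecture described in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' published code) of an LSTM block with a residual connection across every two layers and a simple learned per-position attention weighting, written in PyTorch. The class names, layer sizes, number of blocks, feature dimensions, and the choice of a per-time-step attention score are all illustrative assumptions.

```python
# Hypothetical illustration only -- not the authors' implementation.
import torch
import torch.nn as nn


class ResidualAttentionLSTMBlock(nn.Module):
    """Two stacked LSTM layers with a skip connection from the block input
    to the block output, followed by a learned per-time-step attention
    weighting. Hidden size equals input size so the residual addition works."""

    def __init__(self, dim: int):
        super().__init__()
        self.lstm1 = nn.LSTM(dim, dim, batch_first=True)
        self.lstm2 = nn.LSTM(dim, dim, batch_first=True)
        self.attn_score = nn.Linear(dim, 1)  # one attention score per time step

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        h, _ = self.lstm1(x)
        h, _ = self.lstm2(h)
        h = h + x  # residual connection "from front to back" every two layers
        weights = torch.softmax(self.attn_score(h), dim=1)  # (batch, seq_len, 1)
        return h * weights  # attention re-weights each sequence position


class ResidualAttentionLSTM(nn.Module):
    """Stack several residual blocks and score every position as an
    interface / non-interface residue pair (binary logits)."""

    def __init__(self, dim: int, num_blocks: int = 3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [ResidualAttentionLSTMBlock(dim) for _ in range(num_blocks)]
        )
        self.head = nn.Linear(dim, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            x = block(x)
        return self.head(x)  # (batch, seq_len, 2) logits per position


if __name__ == "__main__":
    # Toy input: 4 sequences of 50 residue-pair feature vectors of size 64.
    model = ResidualAttentionLSTM(dim=64)
    logits = model(torch.randn(4, 50, 64))
    print(logits.shape)  # torch.Size([4, 50, 2])
```

Note that the residual addition requires each block's hidden size to match its input size, which is why the sketch uses a single dim throughout; deeper stacks of such blocks are what the skip connections are meant to keep trainable.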

Bibliographic Details
Main Authors: Liu, Jiale; Gong, Xinqi
Format: Online Article Text
Language: English
Published: BMC Bioinformatics, BioMed Central, 27 November 2019
Subjects: Methodology Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6882172/
https://www.ncbi.nlm.nih.gov/pubmed/31775612
http://dx.doi.org/10.1186/s12859-019-3199-1