S-Pred: protein structural property prediction using MSA transformer

Bibliographic Details
Main Authors: Hong, Yiyu, Song, Jinung, Ko, Junsu, Lee, Juyong, Shin, Woong-Hee
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9381718/
https://www.ncbi.nlm.nih.gov/pubmed/35974061
http://dx.doi.org/10.1038/s41598-022-18205-9
author Hong, Yiyu
Song, Jinung
Ko, Junsu
Lee, Juyong
Shin, Woong-Hee
collection PubMed
description Predicting the local structural features of a protein from its amino acid sequence helps reveal its function and assists in three-dimensional structural modeling. As the sequence-structure gap widens, prediction methods have been developed to bridge it, and as the size of the structural database and the available computing power have grown, the performance of these methods has improved significantly. Herein, we present a powerful new tool called S-Pred, which can predict eight-state secondary structures (SS8), accessible surface areas (ASAs), and intrinsically disordered regions (IDRs) from a given sequence. For feature prediction, S-Pred takes a multiple sequence alignment (MSA) of the query sequence as input. The MSA is converted into features by the MSA Transformer, a protein language model based on an attention mechanism, and a long short-term memory (LSTM) network produces the final prediction. The performance of S-Pred was evaluated on several test sets, and the program consistently provided accurate predictions: the accuracy of the SS8 prediction was approximately 76%, the Pearson correlation between experimental and predicted ASAs was 0.84, and IDRs were predicted with an F1-score of 0.514. The program is freely available as code and a web server at https://github.com/arontier/S_Pred_Paper and https://ad3.io.
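The description outlines the overall architecture: per-residue features from the MSA Transformer are passed to an LSTM that emits the SS8, ASA, and IDR predictions. Below is a minimal PyTorch sketch of that idea only; the embedding dimension (768), hidden size, and all names (e.g. SPredHead) are illustrative assumptions, not the authors' implementation, and the MSA Transformer feature extraction step is replaced by random tensors.

# Minimal sketch (assumed sizes/names): per-residue MSA Transformer
# embeddings -> bidirectional LSTM -> three per-residue output heads
# (SS8 classification, ASA regression, IDR binary logit).
import torch
import torch.nn as nn

class SPredHead(nn.Module):  # hypothetical name, for illustration only
    def __init__(self, embed_dim: int = 768, hidden_dim: int = 512):
        super().__init__()
        # Bidirectional LSTM over the query-sequence embeddings
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.ss8 = nn.Linear(2 * hidden_dim, 8)   # eight-state secondary structure
        self.asa = nn.Linear(2 * hidden_dim, 1)   # accessible surface area
        self.idr = nn.Linear(2 * hidden_dim, 1)   # disorder logit

    def forward(self, x):                         # x: (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)
        return self.ss8(h), self.asa(h).squeeze(-1), self.idr(h).squeeze(-1)

# Random embeddings stand in for MSA Transformer features of a 120-residue query
emb = torch.randn(1, 120, 768)
ss8_logits, asa_pred, idr_logits = SPredHead()(emb)
print(ss8_logits.shape, asa_pred.shape, idr_logits.shape)

In practice the embeddings would come from the query row of an MSA Transformer's per-residue representations (for example, a pretrained model from the fair-esm package); that extraction step is omitted here.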
format Online
Article
Text
id pubmed-9381718
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-9381718 2022-08-18 S-Pred: protein structural property prediction using MSA transformer. Hong, Yiyu; Song, Jinung; Ko, Junsu; Lee, Juyong; Shin, Woong-Hee. Sci Rep (Article). Nature Publishing Group UK, 2022-08-16. /pmc/articles/PMC9381718/ /pubmed/35974061 http://dx.doi.org/10.1038/s41598-022-18205-9 Text en © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title S-Pred: protein structural property prediction using MSA transformer
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9381718/
https://www.ncbi.nlm.nih.gov/pubmed/35974061
http://dx.doi.org/10.1038/s41598-022-18205-9