
PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization

Prediction of therapeutic peptides is a significant step in the discovery of promising therapeutic drugs. Most existing studies have focused on mono-functional therapeutic peptide prediction. However, the number of multi-functional therapeutic peptides (MFTP) is growing rapidly, which calls for new computational schemes to facilitate MFTP discovery. In this study, we propose a novel model, PrMFTP, for MFTP prediction based on a multi-head self-attention mechanism and a class weight optimization algorithm. PrMFTP exploits a multi-scale convolutional neural network, bi-directional long short-term memory, and multi-head self-attention mechanisms to fully extract and learn informative features of peptide sequences for MFTP prediction. In addition, we design a class weight optimization scheme to address the problem of label-imbalanced data. Comprehensive evaluations demonstrate that PrMFTP is superior to other state-of-the-art computational methods for predicting MFTP. We provide a user-friendly web server for PrMFTP, available at http://bioinfo.ahu.edu.cn/PrMFTP.
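
The record does not include code for the described pipeline (a multi-scale CNN feeding a bi-directional LSTM, followed by multi-head self-attention and a multi-label output). Purely as a hedged illustration of that general pipeline, and not the authors' released implementation, a minimal Keras-style sketch might look as follows; the sequence length, vocabulary size, kernel widths, layer sizes, and number of functional classes are assumed placeholders, not values taken from the paper.

    from tensorflow.keras import layers, models

    # Assumed placeholder dimensions, not values from the PrMFTP paper.
    MAX_LEN, VOCAB_SIZE, EMB_DIM, N_CLASSES = 200, 21, 128, 21

    inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
    x = layers.Embedding(VOCAB_SIZE, EMB_DIM)(inputs)

    # Multi-scale CNN: parallel 1D convolutions with different kernel widths.
    branches = [layers.Conv1D(64, k, padding="same", activation="relu")(x)
                for k in (2, 3, 5)]
    x = layers.Concatenate()(branches)

    # Bi-directional LSTM over the convolutional feature maps.
    x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

    # Multi-head self-attention (query = key = value = the sequence features).
    x = layers.MultiHeadAttention(num_heads=4, key_dim=32)(x, x)

    # Pool over the sequence and emit one sigmoid score per functional class,
    # the usual output form for multi-label prediction.
    x = layers.GlobalMaxPooling1D()(x)
    x = layers.Dense(128, activation="relu")(x)
    outputs = layers.Dense(N_CLASSES, activation="sigmoid")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy")

Whether PrMFTP stacks these blocks in exactly this order, or uses different pooling and layer sizes, is not stated in this record; the sketch only mirrors the components named in the abstract.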


Bibliographic Details
Main Authors: Yan, Wenhui, Tang, Wending, Wang, Lihua, Bin, Yannan, Xia, Junfeng
Format: Online, Article, Text
Language: English
Published: Public Library of Science, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9499272/
https://www.ncbi.nlm.nih.gov/pubmed/36094961
http://dx.doi.org/10.1371/journal.pcbi.1010511
_version_ 1784794956304482304
author Yan, Wenhui
Tang, Wending
Wang, Lihua
Bin, Yannan
Xia, Junfeng
author_facet Yan, Wenhui
Tang, Wending
Wang, Lihua
Bin, Yannan
Xia, Junfeng
author_sort Yan, Wenhui
collection PubMed
description Prediction of therapeutic peptides is a significant step in the discovery of promising therapeutic drugs. Most existing studies have focused on mono-functional therapeutic peptide prediction. However, the number of multi-functional therapeutic peptides (MFTP) is growing rapidly, which calls for new computational schemes to facilitate MFTP discovery. In this study, we propose a novel model, PrMFTP, for MFTP prediction based on a multi-head self-attention mechanism and a class weight optimization algorithm. PrMFTP exploits a multi-scale convolutional neural network, bi-directional long short-term memory, and multi-head self-attention mechanisms to fully extract and learn informative features of peptide sequences for MFTP prediction. In addition, we design a class weight optimization scheme to address the problem of label-imbalanced data. Comprehensive evaluations demonstrate that PrMFTP is superior to other state-of-the-art computational methods for predicting MFTP. We provide a user-friendly web server for PrMFTP, available at http://bioinfo.ahu.edu.cn/PrMFTP.
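
The description also refers to a class weight optimization scheme for label-imbalanced data without specifying the algorithm. As a hedged illustration of the underlying idea only (weighting each label's contribution to a multi-label binary cross-entropy loss, for example inversely to its frequency), one might write the following; the function names and the inverse-frequency heuristic are illustrative assumptions, not the paper's optimization algorithm.

    import numpy as np
    import tensorflow as tf

    def inverse_frequency_weights(y_true):
        """y_true: (n_samples, n_classes) binary label matrix.
        Illustrative inverse-frequency weighting; PrMFTP's actual class
        weight optimization algorithm is not reproduced here."""
        pos = y_true.sum(axis=0)                    # positives per class
        w = y_true.shape[0] / (len(pos) * np.maximum(pos, 1.0))
        return w / w.mean()                         # normalize around 1

    def weighted_bce(class_weights):
        """Binary cross-entropy with a fixed per-class weight vector."""
        w = tf.constant(class_weights, dtype=tf.float32)

        def loss(y_true, y_pred):
            y_true = tf.cast(y_true, tf.float32)
            eps = tf.keras.backend.epsilon()
            y_pred = tf.clip_by_value(y_pred, eps, 1.0 - eps)
            bce = -(y_true * tf.math.log(y_pred)
                    + (1.0 - y_true) * tf.math.log(1.0 - y_pred))
            return tf.reduce_mean(w * bce, axis=-1)

        return loss

    # Hypothetical usage with the model sketched earlier and training labels Y_train:
    # model.compile(optimizer="adam",
    #               loss=weighted_bce(inverse_frequency_weights(Y_train)))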
format Online
Article
Text
id pubmed-9499272
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-94992722022-09-23 PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization Yan, Wenhui Tang, Wending Wang, Lihua Bin, Yannan Xia, Junfeng PLoS Comput Biol Research Article Prediction of therapeutic peptides is a significant step in the discovery of promising therapeutic drugs. Most existing studies have focused on mono-functional therapeutic peptide prediction. However, the number of multi-functional therapeutic peptides (MFTP) is growing rapidly, which calls for new computational schemes to facilitate MFTP discovery. In this study, we propose a novel model, PrMFTP, for MFTP prediction based on a multi-head self-attention mechanism and a class weight optimization algorithm. PrMFTP exploits a multi-scale convolutional neural network, bi-directional long short-term memory, and multi-head self-attention mechanisms to fully extract and learn informative features of peptide sequences for MFTP prediction. In addition, we design a class weight optimization scheme to address the problem of label-imbalanced data. Comprehensive evaluations demonstrate that PrMFTP is superior to other state-of-the-art computational methods for predicting MFTP. We provide a user-friendly web server for PrMFTP, available at http://bioinfo.ahu.edu.cn/PrMFTP. Public Library of Science 2022-09-12 /pmc/articles/PMC9499272/ /pubmed/36094961 http://dx.doi.org/10.1371/journal.pcbi.1010511 Text en © 2022 Yan et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Yan, Wenhui
Tang, Wending
Wang, Lihua
Bin, Yannan
Xia, Junfeng
PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization
title PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization
title_full PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization
title_fullStr PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization
title_full_unstemmed PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization
title_short PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization
title_sort prmftp: multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9499272/
https://www.ncbi.nlm.nih.gov/pubmed/36094961
http://dx.doi.org/10.1371/journal.pcbi.1010511
work_keys_str_mv AT yanwenhui prmftpmultifunctionaltherapeuticpeptidespredictionbasedonmultiheadselfattentionmechanismandclassweightoptimization
AT tangwending prmftpmultifunctionaltherapeuticpeptidespredictionbasedonmultiheadselfattentionmechanismandclassweightoptimization
AT wanglihua prmftpmultifunctionaltherapeuticpeptidespredictionbasedonmultiheadselfattentionmechanismandclassweightoptimization
AT binyannan prmftpmultifunctionaltherapeuticpeptidespredictionbasedonmultiheadselfattentionmechanismandclassweightoptimization
AT xiajunfeng prmftpmultifunctionaltherapeuticpeptidespredictionbasedonmultiheadselfattentionmechanismandclassweightoptimization