
SaLT&PepPr is an interface-predicting language model for designing peptide-guided protein degraders

Protein-protein interactions (PPIs) are critical for biological processes and predicting the sites of these interactions is useful for both computational and experimental applications. We present a Structure-agnostic Language Transformer and Peptide Prioritization (SaLT&PepPr) pipeline to predict interaction interfaces from a protein sequence alone for the subsequent generation of peptidic binding motifs. Our model fine-tunes the ESM-2 protein language model (pLM) with a per-position prediction task to identify PPI sites using data from the PDB, and prioritizes motifs which are most likely to be involved within inter-chain binding. By only using amino acid sequence as input, our model is competitive with structural homology-based methods, but exhibits reduced performance compared with deep learning models that input both structural and sequence features. Inspired by our previous results using co-crystals to engineer target-binding “guide” peptides, we curate PPI databases to identify partners for subsequent peptide derivation. Fusing guide peptides to an E3 ubiquitin ligase domain, we demonstrate degradation of endogenous β-catenin, 4E-BP2, and TRIM8, and highlight the nanomolar binding affinity, low off-targeting propensity, and function-altering capability of our best-performing degraders in cancer cells. In total, our study suggests that prioritizing binders from natural interactions via pLMs can enable programmable protein targeting and modulation.


Bibliographic Details
Main Authors: Brixi, Garyk, Ye, Tianzheng, Hong, Lauren, Wang, Tian, Monticello, Connor, Lopez-Barbosa, Natalia, Vincoff, Sophia, Yudistyra, Vivian, Zhao, Lin, Haarer, Elena, Chen, Tianlai, Pertsemlidis, Sarah, Palepu, Kalyan, Bhat, Suhaas, Christopher, Jayani, Li, Xinning, Liu, Tong, Zhang, Sue, Petersen, Lillian, DeLisa, Matthew P., Chatterjee, Pranam
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10598214/
https://www.ncbi.nlm.nih.gov/pubmed/37875551
http://dx.doi.org/10.1038/s42003-023-05464-z
author Brixi, Garyk
Ye, Tianzheng
Hong, Lauren
Wang, Tian
Monticello, Connor
Lopez-Barbosa, Natalia
Vincoff, Sophia
Yudistyra, Vivian
Zhao, Lin
Haarer, Elena
Chen, Tianlai
Pertsemlidis, Sarah
Palepu, Kalyan
Bhat, Suhaas
Christopher, Jayani
Li, Xinning
Liu, Tong
Zhang, Sue
Petersen, Lillian
DeLisa, Matthew P.
Chatterjee, Pranam
collection PubMed
description Protein-protein interactions (PPIs) are critical for biological processes and predicting the sites of these interactions is useful for both computational and experimental applications. We present a Structure-agnostic Language Transformer and Peptide Prioritization (SaLT&PepPr) pipeline to predict interaction interfaces from a protein sequence alone for the subsequent generation of peptidic binding motifs. Our model fine-tunes the ESM-2 protein language model (pLM) with a per-position prediction task to identify PPI sites using data from the PDB, and prioritizes motifs which are most likely to be involved within inter-chain binding. By only using amino acid sequence as input, our model is competitive with structural homology-based methods, but exhibits reduced performance compared with deep learning models that input both structural and sequence features. Inspired by our previous results using co-crystals to engineer target-binding “guide” peptides, we curate PPI databases to identify partners for subsequent peptide derivation. Fusing guide peptides to an E3 ubiquitin ligase domain, we demonstrate degradation of endogenous β-catenin, 4E-BP2, and TRIM8, and highlight the nanomolar binding affinity, low off-targeting propensity, and function-altering capability of our best-performing degraders in cancer cells. In total, our study suggests that prioritizing binders from natural interactions via pLMs can enable programmable protein targeting and modulation.
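The description above states that the model fine-tunes ESM-2 with a per-position prediction task to identify interface residues and then prioritizes motifs for peptide derivation. As a rough illustration of that idea only, and not the authors' released SaLT&PepPr code, the Python sketch below attaches a per-residue token-classification head to a pretrained ESM-2 checkpoint via Hugging Face transformers and rolls per-residue interface probabilities up into a top-scoring window. The checkpoint name, example sequence, and 10-residue window are assumptions, and the classification head here is untrained, whereas the published model is fine-tuned on PDB-derived interface labels.

# Illustrative sketch only, not the authors' SaLT&PepPr implementation:
# a per-residue "interface vs. non-interface" classifier on top of a
# pretrained ESM-2 checkpoint. Checkpoint, sequence, and window length
# are assumptions; the classification head below is untrained and would
# need fine-tuning on PDB-derived interface labels to be meaningful.
import torch
from transformers import AutoTokenizer, EsmForTokenClassification

checkpoint = "facebook/esm2_t12_35M_UR50D"  # assumed small ESM-2 checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = EsmForTokenClassification.from_pretrained(checkpoint, num_labels=2)
model.eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # hypothetical target sequence
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, len(sequence) + 2 special tokens, 2)

# Per-residue probability of the "interface" class, dropping <cls> and <eos>.
probs = torch.softmax(logits, dim=-1)[0, 1:-1, 1]

# Nominate a candidate motif by ranking contiguous windows by mean score.
window = 10
scores = [(i, probs[i:i + window].mean().item())
          for i in range(len(sequence) - window + 1)]
start, score = max(scores, key=lambda s: s[1])
print(sequence[start:start + window], round(score, 3))

In the published pipeline, as the description indicates, guide peptides are ultimately derived from curated PPI partners and fused to an E3 ubiquitin ligase domain; the window ranking above only gestures at how per-residue interface scores might be turned into a candidate motif.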
format Online
Article
Text
id pubmed-10598214
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-10598214 (2023-10-26) SaLT&PepPr is an interface-predicting language model for designing peptide-guided protein degraders. Commun Biol, Article. Nature Publishing Group UK, 2023-10-24. /pmc/articles/PMC10598214/ /pubmed/37875551 http://dx.doi.org/10.1038/s42003-023-05464-z
© The Author(s) 2023. Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.
title SaLT&PepPr is an interface-predicting language model for designing peptide-guided protein degraders
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10598214/
https://www.ncbi.nlm.nih.gov/pubmed/37875551
http://dx.doi.org/10.1038/s42003-023-05464-z
work_keys_str_mv AT brixigaryk saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT yetianzheng saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT honglauren saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT wangtian saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT monticelloconnor saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT lopezbarbosanatalia saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT vincoffsophia saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT yudistyravivian saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT zhaolin saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT haarerelena saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT chentianlai saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT pertsemlidissarah saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT palepukalyan saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT bhatsuhaas saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT christopherjayani saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT lixinning saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT liutong saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT zhangsue saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT petersenlillian saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT delisamatthewp saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders
AT chatterjeepranam saltpepprisaninterfacepredictinglanguagemodelfordesigningpeptideguidedproteindegraders