Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics

Translation elongation is essential for maintaining cellular proteostasis, and alterations in the translational landscape are associated with a range of diseases. Ribosome profiling allows detailed measurement of translation at genome scale. However, it remains unclear how to disentangle biological variations from technical artifacts and identify sequence determinants of translation dysregulation. Here we present Riboformer, a deep learning-based framework for modeling context-dependent changes in translation dynamics. Riboformer leverages the transformer architecture to accurately predict ribosome densities at codon resolution. It corrects experimental artifacts in previously unseen datasets, reveals subtle differences in synonymous codon translation and uncovers a bottleneck in protein synthesis. Further, we show that Riboformer can be combined with in silico mutagenesis analysis to identify sequence motifs that contribute to ribosome stalling across various biological contexts, including aging and viral infection. Our tool offers a context-aware and interpretable approach for standardizing ribosome profiling datasets and elucidating the regulatory basis of translation kinetics.
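
The abstract describes two capabilities that lend themselves to a brief illustration: predicting a ribosome density value for each codon from its sequence context with a transformer, and probing a trained model by in silico mutagenesis. The sketch below is a minimal, toy illustration of the first idea in PyTorch; the window size, embedding size, class and variable names, and the random data are all assumptions made for illustration and are not the authors' Riboformer implementation, which is described in the paper itself.

    # A toy, illustrative per-codon ribosome density predictor with a
    # transformer encoder (PyTorch). NOT the Riboformer implementation:
    # all sizes, names, and the random data below are assumptions.
    import torch
    import torch.nn as nn

    NUM_CODONS = 64   # the 64 codons, encoded as integer ids 0..63
    EMBED_DIM = 32    # assumed embedding size (toy value)
    WINDOW = 31       # assumed codon-context window centered on the codon of interest

    class CodonDensityModel(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(NUM_CODONS, EMBED_DIM)
            self.pos = nn.Parameter(torch.zeros(WINDOW, EMBED_DIM))  # learned positions
            layer = nn.TransformerEncoderLayer(
                d_model=EMBED_DIM, nhead=4, dim_feedforward=64, batch_first=True
            )
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
            self.head = nn.Linear(EMBED_DIM, 1)  # one density value per window

        def forward(self, codons):                 # codons: (batch, WINDOW) integer ids
            x = self.embed(codons) + self.pos      # (batch, WINDOW, EMBED_DIM)
            x = self.encoder(x)
            center = x[:, WINDOW // 2, :]          # representation of the central codon
            return self.head(center).squeeze(-1)   # predicted density, shape (batch,)

    # Toy usage: random codon windows and random target densities.
    model = CodonDensityModel()
    codons = torch.randint(0, NUM_CODONS, (8, WINDOW))
    target = torch.rand(8)
    loss = nn.functional.mse_loss(model(codons), target)
    loss.backward()
    print(float(loss))

Building on such a model, the in silico mutagenesis analysis mentioned in the abstract can be sketched in generic form: substitute each codon of a window in turn, re-score the mutated window, and record how much the prediction changes; positions where substitutions have large effects point to candidate stalling determinants. The snippet below is again only an assumed, self-contained illustration, using a random stand-in scorer in place of a trained model.

    # A generic in silico mutagenesis sketch: substitute each position of a codon
    # window with every alternative codon and record the change in the predicted
    # density. The scoring function is a random stand-in so the example runs on
    # its own; in practice it would be a trained model such as the one above.
    import torch

    NUM_CODONS = 64
    WINDOW = 31

    def mutagenesis_map(score, codons):
        """codons: (WINDOW,) codon ids; score: callable mapping (1, WINDOW) -> (1,).
        Returns a (WINDOW, NUM_CODONS) tensor of predicted density changes."""
        with torch.no_grad():
            base = score(codons.unsqueeze(0)).item()     # wild-type prediction
            deltas = torch.zeros(WINDOW, NUM_CODONS)
            for pos in range(WINDOW):
                for alt in range(NUM_CODONS):
                    mutated = codons.clone()
                    mutated[pos] = alt
                    deltas[pos, alt] = score(mutated.unsqueeze(0)).item() - base
        return deltas

    # Stand-in scorer (a fixed random projection of the codon ids).
    weights = torch.randn(WINDOW)
    score = lambda x: (x.float() * weights).mean(dim=1)

    window = torch.randint(0, NUM_CODONS, (WINDOW,))
    effects = mutagenesis_map(score, window)
    print(effects.abs().max())   # largest single-codon effect in this toy example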

Bibliographic Details
Main Authors: Shao, Bin, Yan, Jiawei, Zhang, Jing, Buskirk, Allen R.
Format: Online Article Text
Language: English
Published: Cold Spring Harbor Laboratory 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10168224/
https://www.ncbi.nlm.nih.gov/pubmed/37163112
http://dx.doi.org/10.1101/2023.04.24.538053
_version_ 1785038818958639104
author Shao, Bin
Yan, Jiawei
Zhang, Jing
Buskirk, Allen R.
author_facet Shao, Bin
Yan, Jiawei
Zhang, Jing
Buskirk, Allen R.
author_sort Shao, Bin
collection PubMed
description Translation elongation is essential for maintaining cellular proteostasis, and alterations in the translational landscape are associated with a range of diseases. Ribosome profiling allows detailed measurement of translation at genome scale. However, it remains unclear how to disentangle biological variations from technical artifacts and identify sequence determinants of translation dysregulation. Here we present Riboformer, a deep learning-based framework for modeling context-dependent changes in translation dynamics. Riboformer leverages the transformer architecture to accurately predict ribosome densities at codon resolution. It corrects experimental artifacts in previously unseen datasets, reveals subtle differences in synonymous codon translation and uncovers a bottleneck in protein synthesis. Further, we show that Riboformer can be combined with in silico mutagenesis analysis to identify sequence motifs that contribute to ribosome stalling across various biological contexts, including aging and viral infection. Our tool offers a context-aware and interpretable approach for standardizing ribosome profiling datasets and elucidating the regulatory basis of translation kinetics.
format Online
Article
Text
id pubmed-10168224
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Cold Spring Harbor Laboratory
record_format MEDLINE/PubMed
spelling pubmed-10168224 2023-05-10 Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics Shao, Bin Yan, Jiawei Zhang, Jing Buskirk, Allen R. bioRxiv Article Translation elongation is essential for maintaining cellular proteostasis, and alterations in the translational landscape are associated with a range of diseases. Ribosome profiling allows detailed measurement of translation at genome scale. However, it remains unclear how to disentangle biological variations from technical artifacts and identify sequence determinants of translation dysregulation. Here we present Riboformer, a deep learning-based framework for modeling context-dependent changes in translation dynamics. Riboformer leverages the transformer architecture to accurately predict ribosome densities at codon resolution. It corrects experimental artifacts in previously unseen datasets, reveals subtle differences in synonymous codon translation and uncovers a bottleneck in protein synthesis. Further, we show that Riboformer can be combined with in silico mutagenesis analysis to identify sequence motifs that contribute to ribosome stalling across various biological contexts, including aging and viral infection. Our tool offers a context-aware and interpretable approach for standardizing ribosome profiling datasets and elucidating the regulatory basis of translation kinetics. Cold Spring Harbor Laboratory 2023-04-28 /pmc/articles/PMC10168224/ /pubmed/37163112 http://dx.doi.org/10.1101/2023.04.24.538053 Text en https://creativecommons.org/licenses/by-nc/4.0/ This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (https://creativecommons.org/licenses/by-nc/4.0/), which allows reusers to distribute, remix, adapt, and build upon the material in any medium or format for noncommercial purposes only, and only so long as attribution is given to the creator.
spellingShingle Article
Shao, Bin
Yan, Jiawei
Zhang, Jing
Buskirk, Allen R.
Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics
title Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics
title_full Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics
title_fullStr Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics
title_full_unstemmed Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics
title_short Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics
title_sort riboformer: a deep learning framework for predicting context-dependent translation dynamics
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10168224/
https://www.ncbi.nlm.nih.gov/pubmed/37163112
http://dx.doi.org/10.1101/2023.04.24.538053
work_keys_str_mv AT shaobin riboformeradeeplearningframeworkforpredictingcontextdependenttranslationdynamics
AT yanjiawei riboformeradeeplearningframeworkforpredictingcontextdependenttranslationdynamics
AT zhangjing riboformeradeeplearningframeworkforpredictingcontextdependenttranslationdynamics
AT buskirkallenr riboformeradeeplearningframeworkforpredictingcontextdependenttranslationdynamics