Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning
It is well known that histone modifications play an important part in various chromatin-dependent processes such as DNA replication, repair, and transcription. Using computational models to predict gene expression based on histone modifications has been intensively studied. However, the accuracy of...
Main Authors: | Chen, Yuchi; Xie, Minzhu; Wen, Jie |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A. 2022 |
Subjects: | Genetics |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9797047/ https://www.ncbi.nlm.nih.gov/pubmed/36588793 http://dx.doi.org/10.3389/fgene.2022.1081842 |
_version_ | 1784860616966537216 |
author | Chen, Yuchi Xie, Minzhu Wen, Jie |
author_facet | Chen, Yuchi Xie, Minzhu Wen, Jie |
author_sort | Chen, Yuchi |
collection | PubMed |
description | It is well known that histone modifications play an important part in various chromatin-dependent processes such as DNA replication, repair, and transcription. Using computational models to predict gene expression from histone modifications has been intensively studied. However, the accuracy of the proposed models still has room for improvement, especially in cross-cell-line gene expression prediction. In this work, we propose TransferChrome, a new deep-learning model that predicts gene expression from histone modifications. The model uses a densely connected convolutional network to capture the features of histone modification data and self-attention layers to aggregate global features of the data. For cross-cell-line gene expression prediction, TransferChrome adopts transfer learning to improve prediction accuracy. We trained and tested our model on 56 different cell lines from the REMC database. The experimental results show that our model achieved an average Area Under the Curve (AUC) score of 84.79%. Compared to three state-of-the-art models, TransferChrome improves prediction performance on most cell lines. Experiments on cross-cell-line gene expression prediction show that TransferChrome performs best and is an efficient model for predicting cross-cell-line gene expression. |
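The abstract describes aggregating global features of binned histone-modification signals with self-attention. The paper's own architecture and dimensions are not reproduced in this record, so the following is only a minimal NumPy sketch of single-head scaled dot-product self-attention over genomic bins; the bin count, channel count, and projection size are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention.

    x: (bins, channels) signal matrix; w_q/w_k/w_v: (channels, d) projections.
    Each output row is a weighted mix of all bins, so every position
    can attend to every other -- the "global features" the abstract mentions.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    weights = softmax(q @ k.T / np.sqrt(d_k))  # (bins, bins) attention map
    return weights @ v

# Toy example: 100 genomic bins x 5 histone-mark channels,
# projected into a 16-dimensional attention space.
rng = np.random.default_rng(0)
x = rng.standard_normal((100, 5))
w_q, w_k, w_v = (rng.standard_normal((5, 16)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (100, 16)
```

In the paper's pipeline this aggregation would sit downstream of the convolutional feature extractor; here the projections are random only to keep the sketch self-contained.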
format | Online Article Text |
id | pubmed-9797047 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-9797047 2022-12-29 Front Genet Genetics Frontiers Media S.A. 2022-12-14 /pmc/articles/PMC9797047/ /pubmed/36588793 http://dx.doi.org/10.3389/fgene.2022.1081842 Text en Copyright © 2022 Chen, Xie and Wen. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Genetics Chen, Yuchi Xie, Minzhu Wen, Jie Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning |
title | Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning |
title_full | Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning |
title_fullStr | Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning |
title_full_unstemmed | Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning |
title_short | Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning |
title_sort | predicting gene expression from histone modifications with self-attention based neural networks and transfer learning |
topic | Genetics |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9797047/ https://www.ncbi.nlm.nih.gov/pubmed/36588793 http://dx.doi.org/10.3389/fgene.2022.1081842 |
work_keys_str_mv | AT chenyuchi predictinggeneexpressionfromhistonemodificationswithselfattentionbasedneuralnetworksandtransferlearning AT xieminzhu predictinggeneexpressionfromhistonemodificationswithselfattentionbasedneuralnetworksandtransferlearning AT wenjie predictinggeneexpressionfromhistonemodificationswithselfattentionbasedneuralnetworksandtransferlearning |