Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts
For any molecule, network, or process of interest, keeping up with new publications is becoming increasingly difficult. For many cellular processes, the number of molecules and interactions that need to be considered can be very large. Automated mining of publications can support large-s...
Main Authors: | , , , |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2021 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8615611/ https://www.ncbi.nlm.nih.gov/pubmed/34827589 http://dx.doi.org/10.3390/biom11111591 |
_version_ | 1784604146147524608 |
---|---|
author | Srivastava, Prashant Bej, Saptarshi Yordanova, Kristina Wolkenhauer, Olaf |
author_facet | Srivastava, Prashant Bej, Saptarshi Yordanova, Kristina Wolkenhauer, Olaf |
author_sort | Srivastava, Prashant |
collection | PubMed |
description | For any molecule, network, or process of interest, keeping up with new publications is becoming increasingly difficult. For many cellular processes, the number of molecules and interactions that need to be considered can be very large. Automated mining of publications can support large-scale molecular interaction maps and database curation. Text mining and Natural-Language-Processing (NLP)-based techniques are finding their applications in mining the biological literature, handling problems such as Named Entity Recognition (NER) and Relationship Extraction (RE). Both rule-based and Machine-Learning (ML)-based NLP approaches have been popular in this context, with multiple research and review articles examining the scope of such models in Biological Literature Mining (BLM). In this review article, we explore self-attention-based models, a special type of Neural-Network (NN)-based architecture that has recently revitalized the field of NLP, applied to biological texts. We cover self-attention models operating either at the sentence level or the abstract level, in the context of molecular interaction extraction, published from 2019 onwards. We conducted a comparative study of the models in terms of their architecture. Moreover, we also discuss some limitations in the field of BLM that identify opportunities for the extraction of molecular interactions from biological text. |
format | Online Article Text |
id | pubmed-8615611 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8615611 2021-11-26 Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts Srivastava, Prashant Bej, Saptarshi Yordanova, Kristina Wolkenhauer, Olaf Biomolecules Review For any molecule, network, or process of interest, keeping up with new publications is becoming increasingly difficult. For many cellular processes, the number of molecules and interactions that need to be considered can be very large. Automated mining of publications can support large-scale molecular interaction maps and database curation. Text mining and Natural-Language-Processing (NLP)-based techniques are finding their applications in mining the biological literature, handling problems such as Named Entity Recognition (NER) and Relationship Extraction (RE). Both rule-based and Machine-Learning (ML)-based NLP approaches have been popular in this context, with multiple research and review articles examining the scope of such models in Biological Literature Mining (BLM). In this review article, we explore self-attention-based models, a special type of Neural-Network (NN)-based architecture that has recently revitalized the field of NLP, applied to biological texts. We cover self-attention models operating either at the sentence level or the abstract level, in the context of molecular interaction extraction, published from 2019 onwards. We conducted a comparative study of the models in terms of their architecture. Moreover, we also discuss some limitations in the field of BLM that identify opportunities for the extraction of molecular interactions from biological text. MDPI 2021-10-27 /pmc/articles/PMC8615611/ /pubmed/34827589 http://dx.doi.org/10.3390/biom11111591 Text en © 2021 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland.
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Review Srivastava, Prashant Bej, Saptarshi Yordanova, Kristina Wolkenhauer, Olaf Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts |
title | Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts |
title_full | Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts |
title_fullStr | Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts |
title_full_unstemmed | Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts |
title_short | Self-Attention-Based Models for the Extraction of Molecular Interactions from Biological Texts |
title_sort | self-attention-based models for the extraction of molecular interactions from biological texts |
topic | Review |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8615611/ https://www.ncbi.nlm.nih.gov/pubmed/34827589 http://dx.doi.org/10.3390/biom11111591 |
work_keys_str_mv | AT srivastavaprashant selfattentionbasedmodelsfortheextractionofmolecularinteractionsfrombiologicaltexts AT bejsaptarshi selfattentionbasedmodelsfortheextractionofmolecularinteractionsfrombiologicaltexts AT yordanovakristina selfattentionbasedmodelsfortheextractionofmolecularinteractionsfrombiologicaltexts AT wolkenhauerolaf selfattentionbasedmodelsfortheextractionofmolecularinteractionsfrombiologicaltexts |