Child-Sum EATree-LSTMs: enhanced attentive Child-Sum Tree-LSTMs for biomedical event extraction
BACKGROUND: Tree-structured neural networks have shown promise in extracting lexical representations of sentence syntactic structures, particularly in the detection of event triggers using recursive neural networks. METHODS: In this study, we introduce an attention mechanism into Child-Sum Tree-LSTMs for the detection of biomedical event triggers. We build on previous research on assigning attention weights to adjacent nodes and integrate this mechanism into Child-Sum Tree-LSTMs to improve the detection of event trigger words. We also address the limitation of shallow syntactic dependencies in Child-Sum Tree-LSTMs by integrating deep syntactic dependencies to enhance the effect of the attention mechanism. RESULTS: Our proposed model, which integrates an enhanced attention mechanism into Tree-LSTM, achieves the best performance on the MLEE and BioNLP’09 datasets. Moreover, our model performs best on almost all complex event categories in the BioNLP’09/11/13 test sets. CONCLUSION: We evaluate the performance of our proposed model on the MLEE and BioNLP datasets and demonstrate the advantage of an enhanced attention mechanism in detecting biomedical event trigger words.
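The METHODS summary above describes one change to the standard Child-Sum Tree-LSTM cell: instead of summing the children's hidden states uniformly, the parent node attends over them before gating. The sketch below illustrates that idea; it is a minimal PyTorch sketch under our own assumptions, not the authors' released code, and the class name AttentiveChildSumCell, the single-layer attention scorer, and all tensor shapes are hypothetical.

```python
# Minimal sketch of a Child-Sum Tree-LSTM cell with attention over child states.
# Illustrative only: the scoring function and names are assumptions, not the
# paper's exact formulation or the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveChildSumCell(nn.Module):
    def __init__(self, in_dim: int, mem_dim: int):
        super().__init__()
        self.W_iou = nn.Linear(in_dim, 3 * mem_dim)             # input/output/update gates (word-input part)
        self.U_iou = nn.Linear(mem_dim, 3 * mem_dim, bias=False)
        self.W_f = nn.Linear(in_dim, mem_dim)                    # forget gate (word-input part)
        self.U_f = nn.Linear(mem_dim, mem_dim, bias=False)       # forget gate (per-child hidden part)
        self.attn = nn.Linear(in_dim + mem_dim, 1)               # scores each child against the parent input

    def forward(self, x, child_h, child_c):
        # x: (in_dim,) parent word representation
        # child_h, child_c: (n_children, mem_dim) child hidden and cell states
        x_rep = x.unsqueeze(0).expand(child_h.size(0), -1)
        scores = self.attn(torch.cat([x_rep, child_h], dim=-1))  # (n_children, 1)
        alpha = F.softmax(scores, dim=0)                          # attention weights over children
        h_tilde = (alpha * child_h).sum(dim=0)                    # attentive aggregation; plain Child-Sum just sums

        i, o, u = torch.chunk(self.W_iou(x) + self.U_iou(h_tilde), 3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.W_f(x).unsqueeze(0) + self.U_f(child_h))  # one forget gate per child
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c
```

For a leaf node one would pass zero-valued child states (e.g. torch.zeros(1, mem_dim)), so the attention collapses onto a single zero vector and the cell reduces to an ordinary LSTM-style update on the word input; trigger detection would then classify each node's hidden state h.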
Main Authors: | Wang, Lei; Cao, Han; Yuan, Liu; Guo, Xiaoxu; Cui, Yachao |
Format: | Online Article Text |
Language: | English |
Published: | BioMed Central 2023 |
Subjects: | Research |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10268412/ https://www.ncbi.nlm.nih.gov/pubmed/37322443 http://dx.doi.org/10.1186/s12859-023-05336-7 |
_version_ | 1785059086838005760 |
author | Wang, Lei Cao, Han Yuan, Liu Guo, Xiaoxu Cui, Yachao |
author_facet | Wang, Lei Cao, Han Yuan, Liu Guo, Xiaoxu Cui, Yachao |
author_sort | Wang, Lei |
collection | PubMed |
description | BACKGROUND: Tree-structured neural networks have shown promise in extracting lexical representations of sentence syntactic structures, particularly in the detection of event triggers using recursive neural networks. METHODS: In this study, we introduce an attention mechanism into Child-Sum Tree-LSTMs for the detection of biomedical event triggers. We build on previous research on assigning attention weights to adjacent nodes and integrate this mechanism into Child-Sum Tree-LSTMs to improve the detection of event trigger words. We also address the limitation of shallow syntactic dependencies in Child-Sum Tree-LSTMs by integrating deep syntactic dependencies to enhance the effect of the attention mechanism. RESULTS: Our proposed model, which integrates an enhanced attention mechanism into Tree-LSTM, achieves the best performance on the MLEE and BioNLP’09 datasets. Moreover, our model performs best on almost all complex event categories in the BioNLP’09/11/13 test sets. CONCLUSION: We evaluate the performance of our proposed model on the MLEE and BioNLP datasets and demonstrate the advantage of an enhanced attention mechanism in detecting biomedical event trigger words. |
format | Online Article Text |
id | pubmed-10268412 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-10268412 2023-06-15 Child-Sum EATree-LSTMs: enhanced attentive Child-Sum Tree-LSTMs for biomedical event extraction Wang, Lei Cao, Han Yuan, Liu Guo, Xiaoxu Cui, Yachao BMC Bioinformatics Research BACKGROUND: Tree-structured neural networks have shown promise in extracting lexical representations of sentence syntactic structures, particularly in the detection of event triggers using recursive neural networks. METHODS: In this study, we introduce an attention mechanism into Child-Sum Tree-LSTMs for the detection of biomedical event triggers. We build on previous research on assigning attention weights to adjacent nodes and integrate this mechanism into Child-Sum Tree-LSTMs to improve the detection of event trigger words. We also address the limitation of shallow syntactic dependencies in Child-Sum Tree-LSTMs by integrating deep syntactic dependencies to enhance the effect of the attention mechanism. RESULTS: Our proposed model, which integrates an enhanced attention mechanism into Tree-LSTM, achieves the best performance on the MLEE and BioNLP’09 datasets. Moreover, our model performs best on almost all complex event categories in the BioNLP’09/11/13 test sets. CONCLUSION: We evaluate the performance of our proposed model on the MLEE and BioNLP datasets and demonstrate the advantage of an enhanced attention mechanism in detecting biomedical event trigger words. BioMed Central 2023-06-15 /pmc/articles/PMC10268412/ /pubmed/37322443 http://dx.doi.org/10.1186/s12859-023-05336-7 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data. |
spellingShingle | Research Wang, Lei Cao, Han Yuan, Liu Guo, Xiaoxu Cui, Yachao Child-Sum EATree-LSTMs: enhanced attentive Child-Sum Tree-LSTMs for biomedical event extraction |
title | Child-Sum EATree-LSTMs: enhanced attentive Child-Sum Tree-LSTMs for biomedical event extraction |
title_full | Child-Sum EATree-LSTMs: enhanced attentive Child-Sum Tree-LSTMs for biomedical event extraction |
title_fullStr | Child-Sum EATree-LSTMs: enhanced attentive Child-Sum Tree-LSTMs for biomedical event extraction |
title_full_unstemmed | Child-Sum EATree-LSTMs: enhanced attentive Child-Sum Tree-LSTMs for biomedical event extraction |
title_short | Child-Sum EATree-LSTMs: enhanced attentive Child-Sum Tree-LSTMs for biomedical event extraction |
title_sort | child-sum eatree-lstms: enhanced attentive child-sum tree-lstms for biomedical event extraction |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10268412/ https://www.ncbi.nlm.nih.gov/pubmed/37322443 http://dx.doi.org/10.1186/s12859-023-05336-7 |
work_keys_str_mv | AT wanglei childsumeatreelstmsenhancedattentivechildsumtreelstmsforbiomedicaleventextraction AT caohan childsumeatreelstmsenhancedattentivechildsumtreelstmsforbiomedicaleventextraction AT yuanliu childsumeatreelstmsenhancedattentivechildsumtreelstmsforbiomedicaleventextraction AT guoxiaoxu childsumeatreelstmsenhancedattentivechildsumtreelstmsforbiomedicaleventextraction AT cuiyachao childsumeatreelstmsenhancedattentivechildsumtreelstmsforbiomedicaleventextraction |