
Hierarchical Attention Neural Network for Event Types to Improve Event Detection

Event detection is an important natural language processing task that aims to detect trigger words in a sentence and classify them into specific event types. Event detection tasks suffer from data sparsity and event instance imbalance problems in small-scale datasets. For this reason, the correlation information of event types can be used to alleviate the above problems. In this paper, we design a Hierarchical Attention Neural Network for Event Types (HANN-ET). Specifically, we select Long Short-Term Memory (LSTM) as the semantic encoder and utilize dynamic multi-pooling and the Graph Attention Network (GAT) to enrich the sentence feature. Meanwhile, we build several upper-level event type modules and employ a weighted attention aggregation mechanism to integrate these modules and obtain the correlated event type information. Each upper-level module is built with Neural Module Networks (NMNs); event types within the same upper-level module can share information, and the attention aggregation mechanism provides effective bias scores for the trigger word classifier. We conduct extensive experiments on the ACE2005 and MAVEN datasets, and the results show that our approach outperforms previous state-of-the-art methods, achieving competitive F1 scores of 78.9% on ACE2005 and 68.8% on MAVEN.


Bibliographic Details
Main Authors: Jin, Yanliang, Ye, Jinjin, Shen, Liquan, Xiong, Yong, Fan, Lele, Zang, Qingfu
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9185344/
https://www.ncbi.nlm.nih.gov/pubmed/35684826
http://dx.doi.org/10.3390/s22114202
_version_ 1784724701567778816
author Jin, Yanliang
Ye, Jinjin
Shen, Liquan
Xiong, Yong
Fan, Lele
Zang, Qingfu
author_facet Jin, Yanliang
Ye, Jinjin
Shen, Liquan
Xiong, Yong
Fan, Lele
Zang, Qingfu
author_sort Jin, Yanliang
collection PubMed
description Event detection is an important natural language processing task that aims to detect trigger words in a sentence and classify them into specific event types. Event detection tasks suffer from data sparsity and event instance imbalance problems in small-scale datasets. For this reason, the correlation information of event types can be used to alleviate the above problems. In this paper, we design a Hierarchical Attention Neural Network for Event Types (HANN-ET). Specifically, we select Long Short-Term Memory (LSTM) as the semantic encoder and utilize dynamic multi-pooling and the Graph Attention Network (GAT) to enrich the sentence feature. Meanwhile, we build several upper-level event type modules and employ a weighted attention aggregation mechanism to integrate these modules and obtain the correlated event type information. Each upper-level module is built with Neural Module Networks (NMNs); event types within the same upper-level module can share information, and the attention aggregation mechanism provides effective bias scores for the trigger word classifier. We conduct extensive experiments on the ACE2005 and MAVEN datasets, and the results show that our approach outperforms previous state-of-the-art methods, achieving competitive F1 scores of 78.9% on ACE2005 and 68.8% on MAVEN.
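The weighted attention aggregation step described in the abstract, where several upper-level event-type module outputs are combined into a single representation, can be sketched as follows. This is a minimal illustration under assumed shapes and simple dot-product scoring, not the authors' implementation; the function and variable names (`attention_aggregate`, `module_outputs`, `sentence_repr`) are hypothetical.

```python
import numpy as np

def attention_aggregate(sentence_repr: np.ndarray,
                        module_outputs: np.ndarray) -> np.ndarray:
    """Combine per-module feature vectors with softmax attention weights.

    sentence_repr:  (d,)   sentence-level feature (e.g., from an LSTM encoder)
    module_outputs: (m, d) one feature vector per upper-level event-type module
    returns:        (d,)   attention-weighted sum of the module outputs
    """
    scores = module_outputs @ sentence_repr           # (m,) dot-product scores
    scores = scores - scores.max()                    # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()   # softmax over modules
    return weights @ module_outputs                   # weighted aggregation

# Toy usage: 3 upper-level modules with 4-dimensional features
rng = np.random.default_rng(0)
s = rng.normal(size=4)
H = rng.normal(size=(3, 4))
agg = attention_aggregate(s, H)
print(agg.shape)  # prints: (4,)
```

The softmax weights make the aggregation a convex combination of the module outputs, so modules whose features align with the sentence representation dominate the combined vector, matching the idea of letting related event types contribute bias scores to the trigger classifier.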
format Online
Article
Text
id pubmed-9185344
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9185344 2022-06-11 Hierarchical Attention Neural Network for Event Types to Improve Event Detection Jin, Yanliang Ye, Jinjin Shen, Liquan Xiong, Yong Fan, Lele Zang, Qingfu Sensors (Basel) Article Event detection is an important natural language processing task that aims to detect trigger words in a sentence and classify them into specific event types. Event detection tasks suffer from data sparsity and event instance imbalance problems in small-scale datasets. For this reason, the correlation information of event types can be used to alleviate the above problems. In this paper, we design a Hierarchical Attention Neural Network for Event Types (HANN-ET). Specifically, we select Long Short-Term Memory (LSTM) as the semantic encoder and utilize dynamic multi-pooling and the Graph Attention Network (GAT) to enrich the sentence feature. Meanwhile, we build several upper-level event type modules and employ a weighted attention aggregation mechanism to integrate these modules and obtain the correlated event type information. Each upper-level module is built with Neural Module Networks (NMNs); event types within the same upper-level module can share information, and the attention aggregation mechanism provides effective bias scores for the trigger word classifier. We conduct extensive experiments on the ACE2005 and MAVEN datasets, and the results show that our approach outperforms previous state-of-the-art methods, achieving competitive F1 scores of 78.9% on ACE2005 and 68.8% on MAVEN. MDPI 2022-05-31 /pmc/articles/PMC9185344/ /pubmed/35684826 http://dx.doi.org/10.3390/s22114202 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Jin, Yanliang
Ye, Jinjin
Shen, Liquan
Xiong, Yong
Fan, Lele
Zang, Qingfu
Hierarchical Attention Neural Network for Event Types to Improve Event Detection
title Hierarchical Attention Neural Network for Event Types to Improve Event Detection
title_full Hierarchical Attention Neural Network for Event Types to Improve Event Detection
title_fullStr Hierarchical Attention Neural Network for Event Types to Improve Event Detection
title_full_unstemmed Hierarchical Attention Neural Network for Event Types to Improve Event Detection
title_short Hierarchical Attention Neural Network for Event Types to Improve Event Detection
title_sort hierarchical attention neural network for event types to improve event detection
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9185344/
https://www.ncbi.nlm.nih.gov/pubmed/35684826
http://dx.doi.org/10.3390/s22114202
work_keys_str_mv AT jinyanliang hierarchicalattentionneuralnetworkforeventtypestoimproveeventdetection
AT yejinjin hierarchicalattentionneuralnetworkforeventtypestoimproveeventdetection
AT shenliquan hierarchicalattentionneuralnetworkforeventtypestoimproveeventdetection
AT xiongyong hierarchicalattentionneuralnetworkforeventtypestoimproveeventdetection
AT fanlele hierarchicalattentionneuralnetworkforeventtypestoimproveeventdetection
AT zangqingfu hierarchicalattentionneuralnetworkforeventtypestoimproveeventdetection