Inter-sentence and Implicit Causality Extraction from Chinese Corpus
Automatically extracting causal relations from text is a challenging task in Natural Language Processing (NLP). Most existing methods focus on extracting intra-sentence or explicit causality, while neglecting causal relations that are expressed implicitly or span sentence boundaries. In this paper...
Main Authors: Jin, Xianxian; Wang, Xinzhi; Luo, Xiangfeng; Huang, Subin; Gu, Shengwei
Format: Online Article Text
Language: English
Published: 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7206172/ http://dx.doi.org/10.1007/978-3-030-47426-3_57
_version_ | 1783530361896042496 |
author | Jin, Xianxian; Wang, Xinzhi; Luo, Xiangfeng; Huang, Subin; Gu, Shengwei |
author_facet | Jin, Xianxian; Wang, Xinzhi; Luo, Xiangfeng; Huang, Subin; Gu, Shengwei |
author_sort | Jin, Xianxian |
collection | PubMed |
description | Automatically extracting causal relations from text is a challenging task in Natural Language Processing (NLP). Most existing methods focus on extracting intra-sentence or explicit causality, while neglecting causal relations that are expressed implicitly or span sentence boundaries. In this paper, we propose the Cascaded multi-Structure Neural Network (CSNN), a novel unified model that extracts inter-sentence and implicit causal relations from a Chinese corpus without relying on external knowledge. The model employs a Convolutional Neural Network (CNN) to capture important features as well as causal structural patterns. A self-attention mechanism is designed to mine semantic and relational characteristics between different features. The outputs of the CNN and the self-attention structure are concatenated as higher-level phrase representations. A Conditional Random Field (CRF) layer is then employed to compute the label of each word in inter-sentence or implicit causal relation sentences, which improves the performance of inter-sentence and implicit causality extraction. Experimental results show that the proposed model achieves state-of-the-art results on three datasets when compared with other methods. |
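The feature pipeline described in the abstract (per-token CNN features and self-attention features concatenated into higher-level representations, which then feed a CRF tagger) can be sketched roughly as follows. This is an illustrative NumPy toy under assumed shapes and kernel sizes, not the authors' implementation; the CRF decoding step is omitted.

```python
import numpy as np

def conv1d_features(X, W, b):
    """Same-padded 1D convolution over the token axis with ReLU.
    X: (seq_len, d_in) token embeddings; W: (k, d_in, d_out) kernel; b: (d_out,)."""
    k, d_in, d_out = W.shape
    pad = k // 2
    Xp = np.pad(X, ((pad, pad), (0, 0)))
    out = np.empty((X.shape[0], d_out))
    for t in range(X.shape[0]):
        window = Xp[t:t + k]  # (k, d_in) receptive field around token t
        out[t] = np.maximum(np.tensordot(window, W, axes=([0, 1], [0, 1])) + b, 0.0)
    return out

def self_attention(X):
    """Scaled dot-product self-attention with Q = K = V = X (no learned projections)."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                   # (seq_len, seq_len) similarities
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)               # row-wise softmax
    return w @ X                                    # each token mixes in its context

rng = np.random.default_rng(0)
seq_len, d = 6, 8                                   # assumed toy dimensions
X = rng.normal(size=(seq_len, d))                   # embeddings for one sentence pair
W = rng.normal(size=(3, d, 4)) * 0.1                # width-3 kernel, 4 filters
b = np.zeros(4)

# Concatenate CNN and self-attention outputs per token; in the paper these
# higher-level phrase representations would be scored by a CRF layer.
H = np.concatenate([conv1d_features(X, W, b), self_attention(X)], axis=1)
print(H.shape)  # (6, 12)
```

The concatenation gives each token both local n-gram evidence (from the convolution) and long-range contextual evidence (from attention), which is what lets a downstream sequence labeler tag cause/effect spans that sit in different sentences.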
format | Online Article Text |
id | pubmed-7206172 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
record_format | MEDLINE/PubMed |
spelling | pubmed-7206172 2020-05-08 Inter-sentence and Implicit Causality Extraction from Chinese Corpus Jin, Xianxian Wang, Xinzhi Luo, Xiangfeng Huang, Subin Gu, Shengwei Advances in Knowledge Discovery and Data Mining Article Automatically extracting causal relations from text is a challenging task in Natural Language Processing (NLP). Most existing methods focus on extracting intra-sentence or explicit causality, while neglecting causal relations that are expressed implicitly or span sentence boundaries. In this paper, we propose the Cascaded multi-Structure Neural Network (CSNN), a novel unified model that extracts inter-sentence and implicit causal relations from a Chinese corpus without relying on external knowledge. The model employs a Convolutional Neural Network (CNN) to capture important features as well as causal structural patterns. A self-attention mechanism is designed to mine semantic and relational characteristics between different features. The outputs of the CNN and the self-attention structure are concatenated as higher-level phrase representations. A Conditional Random Field (CRF) layer is then employed to compute the label of each word in inter-sentence or implicit causal relation sentences, which improves the performance of inter-sentence and implicit causality extraction. Experimental results show that the proposed model achieves state-of-the-art results on three datasets when compared with other methods. 2020-04-17 /pmc/articles/PMC7206172/ http://dx.doi.org/10.1007/978-3-030-47426-3_57 Text en © Springer Nature Switzerland AG 2020 This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. |
spellingShingle | Article Jin, Xianxian Wang, Xinzhi Luo, Xiangfeng Huang, Subin Gu, Shengwei Inter-sentence and Implicit Causality Extraction from Chinese Corpus |
title | Inter-sentence and Implicit Causality Extraction from Chinese Corpus |
title_full | Inter-sentence and Implicit Causality Extraction from Chinese Corpus |
title_fullStr | Inter-sentence and Implicit Causality Extraction from Chinese Corpus |
title_full_unstemmed | Inter-sentence and Implicit Causality Extraction from Chinese Corpus |
title_short | Inter-sentence and Implicit Causality Extraction from Chinese Corpus |
title_sort | inter-sentence and implicit causality extraction from chinese corpus |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7206172/ http://dx.doi.org/10.1007/978-3-030-47426-3_57 |
work_keys_str_mv | AT jinxianxian intersentenceandimplicitcausalityextractionfromchinesecorpus AT wangxinzhi intersentenceandimplicitcausalityextractionfromchinesecorpus AT luoxiangfeng intersentenceandimplicitcausalityextractionfromchinesecorpus AT huangsubin intersentenceandimplicitcausalityextractionfromchinesecorpus AT gushengwei intersentenceandimplicitcausalityextractionfromchinesecorpus |