Extracting chemical–protein relations using attention-based neural networks
Relation extraction is an important task in the field of natural language processing. In this paper, we describe our approach for the BioCreative VI Task 5: text mining chemical–protein interactions. We investigate multiple deep neural network (DNN) models, including convolutional neural networks, recurrent neural networks (RNNs) and attention-based RNNs (ATT-RNNs), to extract chemical–protein relations. Our experimental results indicate that ATT-RNN models outperform the same models without attention, and that the ATT-gated recurrent unit (ATT-GRU) achieves the best micro average F1 score of 0.527 on the test set among the tested DNNs. In addition, the word-level attention weights also show that the attention mechanism is effective in selecting the most important trigger words when trained with semantic relation labels, without the need for semantic parsing or feature engineering. The source code of this work is available at https://github.com/ohnlp/att-chemprot.
Main Authors: Liu, Sijia; Shen, Feichen; Komandur Elayavilli, Ravikumar; Wang, Yanshan; Rastegar-Mojarad, Majid; Chaudhary, Vipin; Liu, Hongfang
Format: Online Article Text
Language: English
Published: Oxford University Press, 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6174551/ https://www.ncbi.nlm.nih.gov/pubmed/30295724 http://dx.doi.org/10.1093/database/bay102
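The abstract above describes applying word-level attention over recurrent (GRU) hidden states to classify chemical–protein relations. The authors' released code is at https://github.com/ohnlp/att-chemprot; the PyTorch sketch below is not that implementation, only an illustration of the attention-over-GRU mechanism the abstract names, with hypothetical hyperparameters (embedding size, hidden size, number of relation classes).

```python
# Minimal sketch (not the authors' code) of word-level attention over BiGRU
# hidden states for sentence-level relation classification.
import torch
import torch.nn as nn


class AttGRURelationClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=200, hidden_dim=128, num_classes=6):
        # Sizes and the number of relation classes are illustrative choices.
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # One attention score per time step, computed from the BiGRU output.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, mask):
        # token_ids: (batch, seq_len); mask: (batch, seq_len), 1 for real tokens.
        h, _ = self.gru(self.embedding(token_ids))            # (batch, seq_len, 2*hidden)
        scores = self.attn(torch.tanh(h)).squeeze(-1)          # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))  # ignore padding
        alpha = torch.softmax(scores, dim=-1)                  # word-level attention weights
        context = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)  # weighted sum of hidden states
        return self.classifier(context), alpha                 # logits + weights for inspection


# Example: classify one padded sentence of 10 tokens.
model = AttGRURelationClassifier(vocab_size=5000)
ids = torch.randint(1, 5000, (1, 10))
mask = torch.ones(1, 10, dtype=torch.long)
logits, attention_weights = model(ids, mask)
```

The returned `attention_weights` correspond to the word-level attention weights the abstract refers to, which can be inspected to see which tokens (e.g. trigger words) the model weighted most heavily for a given relation prediction.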
_version_ | 1783361294948106240 |
author | Liu, Sijia; Shen, Feichen; Komandur Elayavilli, Ravikumar; Wang, Yanshan; Rastegar-Mojarad, Majid; Chaudhary, Vipin; Liu, Hongfang
author_facet | Liu, Sijia; Shen, Feichen; Komandur Elayavilli, Ravikumar; Wang, Yanshan; Rastegar-Mojarad, Majid; Chaudhary, Vipin; Liu, Hongfang
author_sort | Liu, Sijia |
collection | PubMed |
description | Relation extraction is an important task in the field of natural language processing. In this paper, we describe our approach for the BioCreative VI Task 5: text mining chemical–protein interactions. We investigate multiple deep neural network (DNN) models, including convolutional neural networks, recurrent neural networks (RNNs) and attention-based RNNs (ATT-RNNs), to extract chemical–protein relations. Our experimental results indicate that ATT-RNN models outperform the same models without attention, and that the ATT-gated recurrent unit (ATT-GRU) achieves the best micro average F1 score of 0.527 on the test set among the tested DNNs. In addition, the word-level attention weights also show that the attention mechanism is effective in selecting the most important trigger words when trained with semantic relation labels, without the need for semantic parsing or feature engineering. The source code of this work is available at https://github.com/ohnlp/att-chemprot.
format | Online Article Text |
id | pubmed-6174551 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2018 |
publisher | Oxford University Press |
record_format | MEDLINE/PubMed |
spelling | pubmed-6174551 2018-10-11 Extracting chemical–protein relations using attention-based neural networks Liu, Sijia; Shen, Feichen; Komandur Elayavilli, Ravikumar; Wang, Yanshan; Rastegar-Mojarad, Majid; Chaudhary, Vipin; Liu, Hongfang Database (Oxford) Original Article Relation extraction is an important task in the field of natural language processing. In this paper, we describe our approach for the BioCreative VI Task 5: text mining chemical–protein interactions. We investigate multiple deep neural network (DNN) models, including convolutional neural networks, recurrent neural networks (RNNs) and attention-based RNNs (ATT-RNNs), to extract chemical–protein relations. Our experimental results indicate that ATT-RNN models outperform the same models without attention, and that the ATT-gated recurrent unit (ATT-GRU) achieves the best micro average F1 score of 0.527 on the test set among the tested DNNs. In addition, the word-level attention weights also show that the attention mechanism is effective in selecting the most important trigger words when trained with semantic relation labels, without the need for semantic parsing or feature engineering. The source code of this work is available at https://github.com/ohnlp/att-chemprot. Oxford University Press 2018-10-08 /pmc/articles/PMC6174551/ /pubmed/30295724 http://dx.doi.org/10.1093/database/bay102 Text en © The Author(s) 2018. Published by Oxford University Press. http://creativecommons.org/licenses/by/4.0/ This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle | Original Article; Liu, Sijia; Shen, Feichen; Komandur Elayavilli, Ravikumar; Wang, Yanshan; Rastegar-Mojarad, Majid; Chaudhary, Vipin; Liu, Hongfang; Extracting chemical–protein relations using attention-based neural networks
title | Extracting chemical–protein relations using attention-based neural networks |
title_full | Extracting chemical–protein relations using attention-based neural networks |
title_fullStr | Extracting chemical–protein relations using attention-based neural networks |
title_full_unstemmed | Extracting chemical–protein relations using attention-based neural networks |
title_short | Extracting chemical–protein relations using attention-based neural networks |
title_sort | extracting chemical–protein relations using attention-based neural networks |
topic | Original Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6174551/ https://www.ncbi.nlm.nih.gov/pubmed/30295724 http://dx.doi.org/10.1093/database/bay102 |
work_keys_str_mv | AT liusijia extractingchemicalproteinrelationsusingattentionbasedneuralnetworks AT shenfeichen extractingchemicalproteinrelationsusingattentionbasedneuralnetworks AT komandurelayavilliravikumar extractingchemicalproteinrelationsusingattentionbasedneuralnetworks AT wangyanshan extractingchemicalproteinrelationsusingattentionbasedneuralnetworks AT rastegarmojaradmajid extractingchemicalproteinrelationsusingattentionbasedneuralnetworks AT chaudharyvipin extractingchemicalproteinrelationsusingattentionbasedneuralnetworks AT liuhongfang extractingchemicalproteinrelationsusingattentionbasedneuralnetworks |