B-LBConA: a medical entity disambiguation model based on Bio-LinkBERT and context-aware mechanism
Main Authors: Yang, Siyu; Zhang, Peiliang; Che, Chao; Zhong, Zhaoqian
Format: Online Article Text
Language: English
Published: BioMed Central, 2023
Subjects: Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10021986/ https://www.ncbi.nlm.nih.gov/pubmed/36927359 http://dx.doi.org/10.1186/s12859-023-05209-z
Field | Value
---|---
_version_ | 1784908627824345088
author | Yang, Siyu; Zhang, Peiliang; Che, Chao; Zhong, Zhaoqian
author_facet | Yang, Siyu; Zhang, Peiliang; Che, Chao; Zhong, Zhaoqian
author_sort | Yang, Siyu |
collection | PubMed |
description | BACKGROUND: The main task of medical entity disambiguation is to link mentions, such as diseases, drugs, or complications, to standard entities in the target knowledge base. To our knowledge, models based on Bidirectional Encoder Representations from Transformers (BERT) have achieved good results in this task. Unfortunately, these models only consider text in the current document, fail to capture dependencies with other documents, and lack sufficient mining of hidden information in contextual texts. RESULTS: We propose B-LBConA, which is based on Bio-LinkBERT and a context-aware mechanism. Specifically, B-LBConA first utilizes Bio-LinkBERT, which is capable of learning cross-document dependencies, to obtain embedding representations of mentions and candidate entities. Then, cross-attention is used to capture the interaction information of mention-to-entity and entity-to-mention. Finally, B-LBConA incorporates disambiguation clues about the relevance between the mention context and candidate entities via the context-aware mechanism. CONCLUSIONS: Experiment results on three publicly available datasets, NCBI, ADR and ShARe/CLEF, show that B-LBConA achieves significantly more accurate performance compared with existing models. (A hedged sketch of the cross-attention scoring step described here appears after the record fields below.)
format | Online Article Text |
id | pubmed-10021986 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-10021986 2023-03-18 B-LBConA: a medical entity disambiguation model based on Bio-LinkBERT and context-aware mechanism Yang, Siyu; Zhang, Peiliang; Che, Chao; Zhong, Zhaoqian BMC Bioinformatics Research BACKGROUND: The main task of medical entity disambiguation is to link mentions, such as diseases, drugs, or complications, to standard entities in the target knowledge base. To our knowledge, models based on Bidirectional Encoder Representations from Transformers (BERT) have achieved good results in this task. Unfortunately, these models only consider text in the current document, fail to capture dependencies with other documents, and lack sufficient mining of hidden information in contextual texts. RESULTS: We propose B-LBConA, which is based on Bio-LinkBERT and a context-aware mechanism. Specifically, B-LBConA first utilizes Bio-LinkBERT, which is capable of learning cross-document dependencies, to obtain embedding representations of mentions and candidate entities. Then, cross-attention is used to capture the interaction information of mention-to-entity and entity-to-mention. Finally, B-LBConA incorporates disambiguation clues about the relevance between the mention context and candidate entities via the context-aware mechanism. CONCLUSIONS: Experiment results on three publicly available datasets, NCBI, ADR and ShARe/CLEF, show that B-LBConA achieves significantly more accurate performance compared with existing models. BioMed Central 2023-03-16 /pmc/articles/PMC10021986/ /pubmed/36927359 http://dx.doi.org/10.1186/s12859-023-05209-z Text en © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). The Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
spellingShingle | Research; Yang, Siyu; Zhang, Peiliang; Che, Chao; Zhong, Zhaoqian; B-LBConA: a medical entity disambiguation model based on Bio-LinkBERT and context-aware mechanism
title | B-LBConA: a medical entity disambiguation model based on Bio-LinkBERT and context-aware mechanism |
title_full | B-LBConA: a medical entity disambiguation model based on Bio-LinkBERT and context-aware mechanism |
title_fullStr | B-LBConA: a medical entity disambiguation model based on Bio-LinkBERT and context-aware mechanism |
title_full_unstemmed | B-LBConA: a medical entity disambiguation model based on Bio-LinkBERT and context-aware mechanism |
title_short | B-LBConA: a medical entity disambiguation model based on Bio-LinkBERT and context-aware mechanism |
title_sort | b-lbcona: a medical entity disambiguation model based on bio-linkbert and context-aware mechanism |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10021986/ https://www.ncbi.nlm.nih.gov/pubmed/36927359 http://dx.doi.org/10.1186/s12859-023-05209-z |
work_keys_str_mv | AT yangsiyu blbconaamedicalentitydisambiguationmodelbasedonbiolinkbertandcontextawaremechanism AT zhangpeiliang blbconaamedicalentitydisambiguationmodelbasedonbiolinkbertandcontextawaremechanism AT chechao blbconaamedicalentitydisambiguationmodelbasedonbiolinkbertandcontextawaremechanism AT zhongzhaoqian blbconaamedicalentitydisambiguationmodelbasedonbiolinkbertandcontextawaremechanism |
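The description field above outlines the model's pipeline: Bio-LinkBERT encodes the mention and each candidate entity, cross-attention captures mention-to-entity and entity-to-mention interaction, and a context-aware relevance score drives disambiguation. The sketch below illustrates only the encoding and cross-attention scoring step. It is not the authors' implementation: the checkpoint name (michiyasunaga/BioLinkBERT-base), the mean pooling, the untrained attention layer, and the cosine-similarity scorer are all assumptions made for illustration.

```python
# Minimal, hedged sketch of the mention/candidate encoding and cross-attention
# scoring step described in the abstract. The checkpoint name, pooling choice,
# and cosine-similarity scorer are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "michiyasunaga/BioLinkBERT-base"  # assumed public BioLinkBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
encoder.eval()


def encode(texts):
    """Return token-level hidden states and the attention mask for a batch of texts."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = encoder(**batch)
    return out.last_hidden_state, batch["attention_mask"]


# A mention in its document context and two hypothetical candidate entities.
mention = "The patient developed ataxia after the dose was increased."
candidates = ["Ataxia (disorder)", "Cerebellar ataxia"]

mention_states, _ = encode([mention])        # shape: (1, mention_len, hidden)
cand_states, cand_mask = encode(candidates)  # shape: (num_cands, cand_len, hidden)

# Cross-attention layer; randomly initialized here purely to show the data flow.
cross_attn = nn.MultiheadAttention(encoder.config.hidden_size, num_heads=8, batch_first=True)

scores = []
for i in range(len(candidates)):
    # Mention-to-entity attention: mention tokens attend to candidate-entity tokens.
    attended, _ = cross_attn(
        query=mention_states,
        key=cand_states[i : i + 1],
        value=cand_states[i : i + 1],
        key_padding_mask=(cand_mask[i : i + 1] == 0),
    )
    # Mean-pool and score with cosine similarity (a stand-in for the paper's scorer).
    mention_vec = attended.mean(dim=1)
    entity_vec = cand_states[i : i + 1].mean(dim=1)
    scores.append(torch.cosine_similarity(mention_vec, entity_vec).item())

best = max(range(len(candidates)), key=scores.__getitem__)
print(f"Best candidate: {candidates[best]} (score {scores[best]:.3f})")
```

In the actual model the attention and scoring components would be trained jointly on the entity-linking objective and combined with the context-aware relevance clues; this snippet only shows how mention tokens can attend to candidate-entity tokens to produce an interaction-aware score per candidate.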