
Selective UMLS knowledge infusion for biomedical question answering

One of the artificial intelligence applications in the biomedical field is knowledge-intensive question answering. As domain expertise is particularly crucial in this field, we propose a method for efficiently infusing biomedical knowledge into pretrained language models, ultimately targeting biomedical question answering. Transferring all the semantics of a large knowledge graph into the entire model requires too many parameters, increasing computational cost and time. We investigate an efficient approach that leverages adapters to inject Unified Medical Language System knowledge into pretrained language models, and we question the need to use all the semantics in the knowledge graph. This study focuses on strategies for partitioning the knowledge graph and either discarding or merging some partitions for more efficient pretraining. According to the results on three biomedical question answering finetuning datasets, the adapters pretrained on semantically partitioned groups showed more efficient performance in terms of evaluation metrics, required parameters, and time. The results also show that discarding groups with fewer concepts is a better direction for small datasets, and merging these groups is better for large datasets. Furthermore, the metric results show only a slight improvement, demonstrating that the adapter methodology is rather insensitive to the group formulation.
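The abstract's two pretraining strategies (discard small semantic groups vs. merge them) can be sketched as a toy example. This is illustrative only, not the authors' code: the triples, the semantic-group labels, and the size threshold below are all invented for demonstration.

```python
# Sketch of the two group-handling strategies described in the abstract:
# partition knowledge-graph triples by semantic group, then either discard
# groups below a size threshold or merge them into one catch-all group.
from collections import defaultdict

# Hypothetical (head, relation, tail, semantic_group) triples.
TRIPLES = [
    ("aspirin", "treats", "headache", "CHEM"),
    ("ibuprofen", "treats", "fever", "CHEM"),
    ("heart", "part_of", "cardiovascular system", "ANAT"),
    ("fever", "is_a", "symptom", "DISO"),
    ("gene_x", "expressed_in", "liver", "GENE"),
]

def partition(triples):
    """Group triples by their semantic-group label."""
    groups = defaultdict(list)
    for head, rel, tail, grp in triples:
        groups[grp].append((head, rel, tail))
    return dict(groups)

def discard_small(groups, min_size=2):
    """Strategy 1: drop groups with fewer than min_size triples."""
    return {g: ts for g, ts in groups.items() if len(ts) >= min_size}

def merge_small(groups, min_size=2):
    """Strategy 2: merge small groups into a single 'MISC' group."""
    merged, misc = {}, []
    for g, ts in groups.items():
        if len(ts) >= min_size:
            merged[g] = ts
        else:
            misc.extend(ts)
    if misc:
        merged["MISC"] = misc
    return merged
```

Per the abstract's findings, the discard variant would suit small finetuning datasets, while the merge variant retains all triples for larger ones.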


Bibliographic Details
Main Authors: Park, Hyeryun, Son, Jiye, Min, Jeongwon, Choi, Jinwook
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10468517/
https://www.ncbi.nlm.nih.gov/pubmed/37648800
http://dx.doi.org/10.1038/s41598-023-41423-8
_version_ 1785099252590968832
author Park, Hyeryun
Son, Jiye
Min, Jeongwon
Choi, Jinwook
author_facet Park, Hyeryun
Son, Jiye
Min, Jeongwon
Choi, Jinwook
author_sort Park, Hyeryun
collection PubMed
description One of the artificial intelligence applications in the biomedical field is knowledge-intensive question answering. As domain expertise is particularly crucial in this field, we propose a method for efficiently infusing biomedical knowledge into pretrained language models, ultimately targeting biomedical question answering. Transferring all the semantics of a large knowledge graph into the entire model requires too many parameters, increasing computational cost and time. We investigate an efficient approach that leverages adapters to inject Unified Medical Language System knowledge into pretrained language models, and we question the need to use all the semantics in the knowledge graph. This study focuses on strategies for partitioning the knowledge graph and either discarding or merging some partitions for more efficient pretraining. According to the results on three biomedical question answering finetuning datasets, the adapters pretrained on semantically partitioned groups showed more efficient performance in terms of evaluation metrics, required parameters, and time. The results also show that discarding groups with fewer concepts is a better direction for small datasets, and merging these groups is better for large datasets. Furthermore, the metric results show only a slight improvement, demonstrating that the adapter methodology is rather insensitive to the group formulation.
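The adapter-based infusion mentioned in the description can be illustrated with a minimal bottleneck-adapter module (a standard down-project/ReLU/up-project design with a residual connection). This is a generic sketch, not the authors' implementation; the hidden size, bottleneck size, and weights below are arbitrary stand-ins.

```python
import numpy as np

class Adapter:
    """Minimal bottleneck adapter: down-project, ReLU, up-project, residual add.
    Only these small matrices would be trained; the base model stays frozen."""
    def __init__(self, hidden=8, bottleneck=2, seed=0):
        rng = np.random.default_rng(seed)
        self.w_down = rng.normal(scale=0.02, size=(hidden, bottleneck))
        self.w_up = rng.normal(scale=0.02, size=(bottleneck, hidden))

    def __call__(self, x):
        h = np.maximum(x @ self.w_down, 0.0)  # down-projection + ReLU
        return x + h @ self.w_up              # up-projection + residual

x = np.ones((1, 8))        # stand-in for a transformer hidden state
adapted = Adapter()(x)     # output keeps the input's shape
```

The trainable parameter count scales with hidden × bottleneck rather than with the full model, which is why adapter pretraining is cheap in parameters and time, as the abstract emphasizes.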
format Online
Article
Text
id pubmed-10468517
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-10468517 2023-09-01 Selective UMLS knowledge infusion for biomedical question answering Park, Hyeryun Son, Jiye Min, Jeongwon Choi, Jinwook Sci Rep Article One of the artificial intelligence applications in the biomedical field is knowledge-intensive question answering. As domain expertise is particularly crucial in this field, we propose a method for efficiently infusing biomedical knowledge into pretrained language models, ultimately targeting biomedical question answering. Transferring all the semantics of a large knowledge graph into the entire model requires too many parameters, increasing computational cost and time. We investigate an efficient approach that leverages adapters to inject Unified Medical Language System knowledge into pretrained language models, and we question the need to use all the semantics in the knowledge graph. This study focuses on strategies for partitioning the knowledge graph and either discarding or merging some partitions for more efficient pretraining. According to the results on three biomedical question answering finetuning datasets, the adapters pretrained on semantically partitioned groups showed more efficient performance in terms of evaluation metrics, required parameters, and time. The results also show that discarding groups with fewer concepts is a better direction for small datasets, and merging these groups is better for large datasets. Furthermore, the metric results show only a slight improvement, demonstrating that the adapter methodology is rather insensitive to the group formulation.
Nature Publishing Group UK 2023-08-30 /pmc/articles/PMC10468517/ /pubmed/37648800 http://dx.doi.org/10.1038/s41598-023-41423-8 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
spellingShingle Article
Park, Hyeryun
Son, Jiye
Min, Jeongwon
Choi, Jinwook
Selective UMLS knowledge infusion for biomedical question answering
title Selective UMLS knowledge infusion for biomedical question answering
title_full Selective UMLS knowledge infusion for biomedical question answering
title_fullStr Selective UMLS knowledge infusion for biomedical question answering
title_full_unstemmed Selective UMLS knowledge infusion for biomedical question answering
title_short Selective UMLS knowledge infusion for biomedical question answering
title_sort selective umls knowledge infusion for biomedical question answering
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10468517/
https://www.ncbi.nlm.nih.gov/pubmed/37648800
http://dx.doi.org/10.1038/s41598-023-41423-8
work_keys_str_mv AT parkhyeryun selectiveumlsknowledgeinfusionforbiomedicalquestionanswering
AT sonjiye selectiveumlsknowledgeinfusionforbiomedicalquestionanswering
AT minjeongwon selectiveumlsknowledgeinfusionforbiomedicalquestionanswering
AT choijinwook selectiveumlsknowledgeinfusionforbiomedicalquestionanswering