Adding an Attention Layer Improves the Performance of a Neural Network Architecture for Synonymy Prediction in the UMLS Metathesaurus
BACKGROUND: Terminology integration at the scale of the UMLS Metathesaurus (i.e., over 200 source vocabularies) remains challenging despite recent advances in ontology alignment techniques based on neural networks. OBJECTIVES: To improve the performance of the neural network architecture we developed for predicting synonymy between terms in the UMLS Metathesaurus, specifically through the addition of an attention layer. METHODS: We modify our original Siamese neural network architecture with Long Short-Term Memory (LSTM) and create two variants by (1) adding an attention layer on top of the existing LSTM, and (2) replacing the existing LSTM layer with an attention layer. RESULTS: Adding an attention layer to the LSTM layer resulted in increasing precision to 92.38% (+3.63%) and F1 score to 91.74% (+1.13%), with limited impact on recall at 91.12% (−1.42%). CONCLUSIONS: Although limited, this increase in precision substantially reduces the false positive rate and minimizes the need for manual curation.
Main Authors: | Nguyen, Vinh, Bodenreider, Olivier |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9484765/ https://www.ncbi.nlm.nih.gov/pubmed/35672982 http://dx.doi.org/10.3233/SHTI220043 |
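The METHODS portion of the abstract above describes a Siamese LSTM encoder for term pairs, extended with an attention layer. The following is a minimal PyTorch sketch of that general kind of architecture; the additive attention pooling, the Manhattan-distance scoring head, and all hyperparameters (vocabulary size, dimensions) are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a Siamese LSTM with an attention layer for synonymy
# prediction, in the spirit of the architecture described in the abstract.
# Hyperparameters and the scoring head are assumptions for illustration.
import torch
import torch.nn as nn

class AttentiveSiameseLSTM(nn.Module):
    def __init__(self, vocab_size=30000, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Additive attention over the LSTM outputs; assumption: the paper's
        # attention layer similarly pools the per-token LSTM states into a
        # single vector representing the term.
        self.attn = nn.Linear(hidden_dim, 1)

    def encode(self, token_ids):
        # token_ids: (batch, seq_len) integer-encoded term
        states, _ = self.lstm(self.embed(token_ids))       # (B, T, H)
        weights = torch.softmax(self.attn(states), dim=1)  # (B, T, 1)
        return (weights * states).sum(dim=1)               # (B, H)

    def forward(self, term_a, term_b):
        # Shared (Siamese) encoder applied to both terms, then a
        # Manhattan-distance similarity squashed into (0, 1].
        va, vb = self.encode(term_a), self.encode(term_b)
        return torch.exp(-torch.norm(va - vb, p=1, dim=1))

model = AttentiveSiameseLSTM()
a = torch.randint(1, 30000, (4, 12))  # 4 term pairs, 12 tokens each
b = torch.randint(1, 30000, (4, 12))
print(model(a, b))  # synonymy scores in (0, 1]
```

Because the encoder weights are shared across both inputs, the two terms are mapped into the same vector space, which is what makes a distance-based synonymy score meaningful.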
_version_ | 1784791943067205632 |
---|---|
author | Nguyen, Vinh Bodenreider, Olivier |
author_facet | Nguyen, Vinh Bodenreider, Olivier |
author_sort | Nguyen, Vinh |
collection | PubMed |
description | BACKGROUND: Terminology integration at the scale of the UMLS Metathesaurus (i.e., over 200 source vocabularies) remains challenging despite recent advances in ontology alignment techniques based on neural networks. OBJECTIVES: To improve the performance of the neural network architecture we developed for predicting synonymy between terms in the UMLS Metathesaurus, specifically through the addition of an attention layer. METHODS: We modify our original Siamese neural network architecture with Long Short-Term Memory (LSTM) and create two variants by (1) adding an attention layer on top of the existing LSTM, and (2) replacing the existing LSTM layer with an attention layer. RESULTS: Adding an attention layer to the LSTM layer resulted in increasing precision to 92.38% (+3.63%) and F1 score to 91.74% (+1.13%), with limited impact on recall at 91.12% (−1.42%). CONCLUSIONS: Although limited, this increase in precision substantially reduces the false positive rate and minimizes the need for manual curation. |
format | Online Article Text |
id | pubmed-9484765 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
record_format | MEDLINE/PubMed |
spelling | pubmed-9484765 2022-09-19 Adding an Attention Layer Improves the Performance of a Neural Network Architecture for Synonymy Prediction in the UMLS Metathesaurus Nguyen, Vinh Bodenreider, Olivier Stud Health Technol Inform Article BACKGROUND: Terminology integration at the scale of the UMLS Metathesaurus (i.e., over 200 source vocabularies) remains challenging despite recent advances in ontology alignment techniques based on neural networks. OBJECTIVES: To improve the performance of the neural network architecture we developed for predicting synonymy between terms in the UMLS Metathesaurus, specifically through the addition of an attention layer. METHODS: We modify our original Siamese neural network architecture with Long Short-Term Memory (LSTM) and create two variants by (1) adding an attention layer on top of the existing LSTM, and (2) replacing the existing LSTM layer with an attention layer. RESULTS: Adding an attention layer to the LSTM layer resulted in increasing precision to 92.38% (+3.63%) and F1 score to 91.74% (+1.13%), with limited impact on recall at 91.12% (−1.42%). CONCLUSIONS: Although limited, this increase in precision substantially reduces the false positive rate and minimizes the need for manual curation. 2022-06-06 /pmc/articles/PMC9484765/ /pubmed/35672982 http://dx.doi.org/10.3233/SHTI220043 Text en https://creativecommons.org/licenses/by-nc/4.0/ This article is published online with Open Access by IOS Press and distributed under the terms of the Creative Commons Attribution Non-Commercial License 4.0 (CC BY-NC 4.0) (https://creativecommons.org/licenses/by-nc/4.0/). |
spellingShingle | Article Nguyen, Vinh Bodenreider, Olivier Adding an Attention Layer Improves the Performance of a Neural Network Architecture for Synonymy Prediction in the UMLS Metathesaurus |
title | Adding an Attention Layer Improves the Performance of a Neural Network Architecture for Synonymy Prediction in the UMLS Metathesaurus |
title_full | Adding an Attention Layer Improves the Performance of a Neural Network Architecture for Synonymy Prediction in the UMLS Metathesaurus |
title_fullStr | Adding an Attention Layer Improves the Performance of a Neural Network Architecture for Synonymy Prediction in the UMLS Metathesaurus |
title_full_unstemmed | Adding an Attention Layer Improves the Performance of a Neural Network Architecture for Synonymy Prediction in the UMLS Metathesaurus |
title_short | Adding an Attention Layer Improves the Performance of a Neural Network Architecture for Synonymy Prediction in the UMLS Metathesaurus |
title_sort | adding an attention layer improves the performance of a neural network architecture for synonymy prediction in the umls metathesaurus |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9484765/ https://www.ncbi.nlm.nih.gov/pubmed/35672982 http://dx.doi.org/10.3233/SHTI220043 |
work_keys_str_mv | AT nguyenvinh addinganattentionlayerimprovestheperformanceofaneuralnetworkarchitectureforsynonymypredictionintheumlsmetathesaurus AT bodenreiderolivier addinganattentionlayerimprovestheperformanceofaneuralnetworkarchitectureforsynonymypredictionintheumlsmetathesaurus |
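As a quick sanity check on the RESULTS reported in this record, the F1 score should be the harmonic mean of the stated precision and recall. The short Python snippet below, with the two values copied from the abstract, reproduces the reported F1 of 91.74% up to rounding of the inputs.

```python
# Sanity check: F1 is the harmonic mean of precision and recall.
# Values copied from the abstract (rounded to two decimals there).
p, r = 0.9238, 0.9112  # precision, recall after adding attention
f1 = 2 * p * r / (p + r)
print(f"F1 = {f1:.5f}")  # F1 = 0.91746, consistent with the reported
                         # 91.74% given the rounding of p and r
```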