
Pre-Trained Joint Model for Intent Classification and Slot Filling with Semantic Feature Fusion


Bibliographic Details
Main Authors: Chen, Yan; Luo, Zhenghang
Format: Online Article Text
Language: English
Published: MDPI, 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10006958/
https://www.ncbi.nlm.nih.gov/pubmed/36905052
http://dx.doi.org/10.3390/s23052848
Journal: Sensors (Basel)
Collection: PubMed (pubmed-10006958)
Published online: 2023-03-06

Abstract: The comprehension of spoken language is a crucial aspect of dialogue systems, encompassing two fundamental tasks: intent classification and slot filling. Currently, the joint modeling approach for these two tasks has emerged as the dominant method in spoken language understanding modeling. However, the existing joint models have limitations in terms of their relevancy and utilization of contextual semantic features between the multiple tasks. To address these limitations, a joint model based on BERT and semantic fusion (JMBSF) is proposed. The model employs pre-trained BERT to extract semantic features and utilizes semantic fusion to associate and integrate this information. The results of experiments on two benchmark datasets, ATIS and Snips, in spoken language comprehension demonstrate that the proposed JMBSF model attains 98.80% and 99.71% intent classification accuracy, 98.25% and 97.24% slot-filling F1-score, and 93.40% and 93.57% sentence accuracy, respectively. These results reveal a significant improvement compared to other joint models. Furthermore, comprehensive ablation studies affirm the effectiveness of each component in the design of JMBSF.

License: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
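The abstract's two tasks, and the "sentence accuracy" metric it reports, can be illustrated with a minimal sketch. The utterance, intent label, and simplified BIO slot tags below are hypothetical ATIS-style examples chosen for illustration; they are not taken from the paper, and real ATIS slot names are more specific (e.g. fromloc.city_name).

```python
# Illustrative outputs of the two spoken-language-understanding tasks
# for one hypothetical ATIS-style utterance.
tokens = ["flights", "from", "boston", "to", "denver"]
intent = "atis_flight"                           # sentence-level label
slots = ["O", "O", "B-fromloc", "O", "B-toloc"]  # one BIO tag per token

# Sentence accuracy (the strictest metric in the abstract) counts an
# utterance as correct only when the intent AND every slot tag match.
def sentence_correct(pred_intent, pred_slots, gold_intent, gold_slots):
    return pred_intent == gold_intent and pred_slots == gold_slots

print(sentence_correct(intent, slots,
                       "atis_flight",
                       ["O", "O", "B-fromloc", "O", "B-toloc"]))  # True
print(sentence_correct(intent, ["O"] * 5,
                       "atis_flight", slots))                     # False
```

A joint model such as JMBSF predicts both outputs from one shared encoding of the utterance, which is why sentence accuracy is a natural end-to-end measure for it.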