An Improved Math Word Problem (MWP) Model Using Unified Pretrained Language Model (UniLM) for Pretraining
Natural Language Understanding (NLU) and Natural Language Generation (NLG) are the general methods that support machine understanding of text content. They play a very important role in text information processing systems, including recommendation and question-answering systems. There are many re...
Main Authors: Zhang, Dongqiu; Li, Wenkui
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9303081/ https://www.ncbi.nlm.nih.gov/pubmed/35875782 http://dx.doi.org/10.1155/2022/7468286
Similar Items
- Pretrained Transformer Language Models Versus Pretrained Word Embeddings for the Detection of Accurate Health Information on Arabic Social Media: Comparative Study
  by: Albalawi, Yahya, et al.
  Published: (2022)
- To pretrain or not? A systematic analysis of the benefits of pretraining in diabetic retinopathy
  by: Srinivasan, Vignesh, et al.
  Published: (2022)
- Document Network Projection in Pretrained Word Embedding Space
  by: Gourru, Antoine, et al.
  Published: (2020)
- BatteryBERT: A Pretrained Language Model for Battery Database Enhancement
  by: Huang, Shu, et al.
  Published: (2022)
- Chemical–protein relation extraction with ensembles of carefully tuned pretrained language models
  by: Weber, Leon, et al.
  Published: (2022)