Improving text mining in plant health domain with GAN and/or pre-trained language model
The Bidirectional Encoder Representations from Transformers (BERT) architecture offers a cutting-edge approach to Natural Language Processing. It involves two steps: 1) pre-training a language model to extract contextualized features and 2) fine-tuning for specific downstream tasks. Although pre-tra...
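As a brief illustration of the two-step paradigm described in the abstract, below is a minimal sketch (not the authors' code) of loading a pre-trained BERT model and running one fine-tuning step with the Hugging Face transformers library; the model name, label scheme, and example sentences are illustrative assumptions only.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Step 1: start from a pre-trained language model whose weights already
# encode contextualized features.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # hypothetical binary downstream task
)

# Step 2: fine-tune on task-specific examples; these toy plant-health
# sentences are invented for illustration.
texts = ["Aphids were observed on wheat leaves.",
         "The concert starts at eight tonight."]
labels = torch.tensor([1, 0])  # 1 = plant-health related, 0 = not
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**batch, labels=labels).loss  # cross-entropy from the classification head
loss.backward()
optimizer.step()
print(f"fine-tuning step loss: {loss.item():.4f}")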
Main Authors: Jiang, Shufan; Cormier, Stéphane; Angarita, Rafael; Rousseaux, Francis
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2023
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9989305/
https://www.ncbi.nlm.nih.gov/pubmed/36895200
http://dx.doi.org/10.3389/frai.2023.1072329
Similar Items
- Performance analysis of large language models in the domain of legal argument mining
  by: Al Zubaer, Abdullah, et al.
  Published: (2023)
- Probing language identity encoded in pre-trained multilingual models: a typological view
  by: Zheng, Jianyu, et al.
  Published: (2022)
- Online Brand Community User Segments: A Text Mining Approach
  by: Ge, Ruichen, et al.
  Published: (2022)
- Event classification from the Urdu language text on social media
  by: Awan, Malik Daler Ali, et al.
  Published: (2021)
- Effect of stemming on text similarity for Arabic language at sentence level
  by: Alhawarat, Mohammad O., et al.
  Published: (2021)