
Improving text mining in plant health domain with GAN and/or pre-trained language model

The Bidirectional Encoder Representations from Transformers (BERT) architecture offers a cutting-edge approach to Natural Language Processing. It involves two steps: 1) pre-training a language model to extract contextualized features and 2) fine-tuning for specific downstream tasks. Although pre-tra...
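The two-step workflow described in the abstract can be illustrated with a minimal sketch: load a pre-trained BERT checkpoint, then fine-tune it on a downstream classification task. The model name, example sentences, and labels below are illustrative assumptions, not taken from the article.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Step 1: load a pre-trained language model (checkpoint name is an assumption).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Hypothetical plant-health sentences with binary labels
# (1 = reports a pest/disease observation, 0 = does not).
texts = [
    "Downy mildew spotted on the lower vine leaves this morning.",
    "The weather was sunny and warm across the region.",
]
labels = torch.tensor([1, 0])

# Step 2: fine-tuning -- a single optimisation step, shown for illustration only.
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()
print(f"training loss: {outputs.loss.item():.4f}")
```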


Bibliographic Details
Main Authors: Jiang, Shufan; Cormier, Stéphane; Angarita, Rafael; Rousseaux, Francis
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9989305/
https://www.ncbi.nlm.nih.gov/pubmed/36895200
http://dx.doi.org/10.3389/frai.2023.1072329

Similar Items