To BERT or Not to BERT: Dealing with Possible BERT Failures in an Entailment Task
In this paper we focus on a Natural Language Inference task. Given two sentences, we classify their relation as NEUTRAL, ENTAILMENT or CONTRADICTION. Considering the achievements of BERT (Bidirectional Encoder Representations from Transformers) in many Natural Language Processing tasks, we us...
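The abstract frames the task as three-way sentence-pair classification. With BERT, both sentences are typically packed into a single input sequence separated by special tokens, with segment ids distinguishing the two halves. A minimal sketch of that input layout, assuming the standard BERT conventions (the whitespace tokenizer below is only a stand-in for BERT's WordPiece tokenizer, and the helper names are illustrative):

```python
# Sketch of framing NLI as BERT sentence-pair classification.
# The [CLS]/[SEP] layout and segment ids follow standard BERT usage;
# the toy whitespace tokenizer stands in for a real WordPiece tokenizer.

LABELS = ["NEUTRAL", "ENTAILMENT", "CONTRADICTION"]

def encode_pair(premise: str, hypothesis: str):
    """Build the [CLS] premise [SEP] hypothesis [SEP] input with segment ids."""
    p_tokens = premise.lower().split()
    h_tokens = hypothesis.lower().split()
    tokens = ["[CLS]"] + p_tokens + ["[SEP]"] + h_tokens + ["[SEP]"]
    # Segment (token type) ids: 0 for the premise half, 1 for the hypothesis half.
    segment_ids = [0] * (len(p_tokens) + 2) + [1] * (len(h_tokens) + 1)
    return tokens, segment_ids

tokens, segments = encode_pair("A man is sleeping.", "A person is awake.")
print(tokens, segments)
```

A classifier head over the final `[CLS]` representation then produces a score for each of the three labels.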
Main Authors: Fialho, Pedro; Coheur, Luísa; Quaresma, Paulo
Format: Online Article Text
Language: English
Published: 2020
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7274325/ http://dx.doi.org/10.1007/978-3-030-50146-4_54
Similar Items
- VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
  by: Lu, Zhibin, et al.
  Published: (2020)
- IUP-BERT: Identification of Umami Peptides Based on BERT Features
  by: Jiang, Liangzhen, et al.
  Published: (2022)
- Fusion-ConvBERT: Parallel Convolution and BERT Fusion for Speech Emotion Recognition
  by: Lee, Sanghyun, et al.
  Published: (2020)
- srBERT: automatic article classification model for systematic review using BERT
  by: Aum, Sungmin, et al.
  Published: (2021)
- Diagnosing BERT with Retrieval Heuristics
  by: Câmara, Arthur, et al.
  Published: (2020)