AUBER: Automated BERT regularization
How can we effectively regularize BERT? Although BERT proves its effectiveness in various NLP tasks, it often overfits when there are only a small number of training instances. A promising direction to regularize BERT is based on pruning its attention heads with a proxy score for head importance. Ho...
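The abstract's idea of pruning attention heads by a proxy importance score can be illustrated with a small sketch. The snippet below is not the AUBER algorithm itself (the abstract is truncated before the method is described); it only shows, under assumed toy shapes and a hypothetical output-norm proxy, how heads might be scored and the lowest-scoring ones masked out.

```python
import torch

torch.manual_seed(0)

# Toy setup (assumed, not from the paper): one transformer layer with
# H attention heads producing outputs of shape (batch, heads, seq, head_dim).
H, D_HEAD, SEQ, BATCH = 12, 64, 16, 8
head_outputs = torch.randn(BATCH, H, SEQ, D_HEAD)

# Hypothetical proxy importance score: mean L2 norm of each head's output.
# Other common proxies include gradient-based sensitivity or attention entropy.
proxy_scores = head_outputs.norm(dim=-1).mean(dim=(0, 2))  # shape (H,)

# Prune the k lowest-scoring heads by building a binary head mask.
k = 4  # hypothetical number of heads to prune
pruned = proxy_scores.argsort()[:k]
head_mask = torch.ones(H)
head_mask[pruned] = 0.0

# Zero out the pruned heads' contributions, regularizing the layer.
regularized = head_outputs * head_mask.view(1, H, 1, 1)

print("proxy scores:", [round(s, 2) for s in proxy_scores.tolist()])
print("pruned heads:", pruned.tolist())
```

In practice such a mask would be applied inside the model's forward pass (e.g., via the `head_mask` argument of Hugging Face BERT implementations) rather than to cached outputs as in this sketch.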
Main Authors: Lee, Hyun Dong; Lee, Seongmin; Kang, U.
Format: Online Article (Text)
Language: English
Published: Public Library of Science, 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8238198/ | https://www.ncbi.nlm.nih.gov/pubmed/34181664 | http://dx.doi.org/10.1371/journal.pone.0253241
Similar Items
- To BERT or Not to BERT Dealing with Possible BERT Failures in an Entailment Task
  by: Fialho, Pedro, et al.
  Published: (2020)
- Fusion-ConvBERT: Parallel Convolution and BERT Fusion for Speech Emotion Recognition
  by: Lee, Sanghyun, et al.
  Published: (2020)
- Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT
  by: Cho, Ikhyun, et al.
  Published: (2022)
- VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
  by: Lu, Zhibin, et al.
  Published: (2020)
- IUP-BERT: Identification of Umami Peptides Based on BERT Features
  by: Jiang, Liangzhen, et al.
  Published: (2022)