
AUBER: Automated BERT regularization

How can we effectively regularize BERT? Although BERT proves effective in various NLP tasks, it often overfits when only a small number of training instances are available. A promising direction for regularizing BERT is to prune its attention heads using a proxy score for head importance. However, such methods are usually suboptimal: they prune an arbitrarily determined number of attention heads and do not directly aim at improving performance. To overcome this limitation, we propose AUBER, an automated BERT regularization method that leverages reinforcement learning to automatically prune the proper attention heads from BERT. We also minimize the model complexity and the action search space by proposing a low-dimensional state representation and a dually-greedy approach for training. Experimental results show that AUBER outperforms existing pruning methods, achieving up to 9.58% better performance. In addition, an ablation study demonstrates the effectiveness of AUBER's design choices.
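
For context, the operation the abstract builds on, removing selected attention heads from a trained BERT, can be sketched with the Hugging Face transformers library. The sketch below is a minimal illustration under stated assumptions: the model name, label count, and layer/head indices are hypothetical placeholders, not choices produced by AUBER's reinforcement-learning agent, whose details this record does not include.

# Minimal sketch of attention-head pruning in BERT (the step AUBER automates).
from transformers import BertForSequenceClassification

# BERT-base has 12 layers with 12 attention heads each.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# prune_heads expects {layer_index: [head indices to remove]} and deletes the
# corresponding parameters. These indices are placeholders for illustration;
# AUBER would select them via its reinforcement-learning agent instead.
model.prune_heads({0: [3, 7], 5: [1]})

# Fine-tune the pruned model on the small downstream dataset as usual; the
# reduced head count acts as a regularizer when training instances are scarce.

Because pruning permanently removes parameters, the regularization also yields a slightly smaller and faster model.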

Bibliographic Details
Main Authors: Lee, Hyun Dong; Lee, Seongmin; Kang, U.
Format: Online Article Text
Language: English
Published: Public Library of Science, 2021
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8238198/
https://www.ncbi.nlm.nih.gov/pubmed/34181664
http://dx.doi.org/10.1371/journal.pone.0253241
Collection: PubMed (record id: pubmed-8238198)
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: PLoS One (Research Article); published online 2021-06-28
License: © 2021 Lee et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.