An Improved Transformer-Based Neural Machine Translation Strategy: Interacting-Head Attention
Transformer-based models have achieved significant advances in neural machine translation (NMT). The main component of the Transformer is the multi-head attention layer. In theory, more heads enhance the expressive power of the NMT model, but this is not always the case in practice. On the one hand, th...
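The abstract refers to the standard multi-head attention layer whose head count the paper studies. A minimal NumPy sketch of that standard mechanism (not the paper's interacting-head variant; dimensions, weight names, and the head count below are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Scaled dot-product self-attention split across num_heads heads.

    Each head attends over a d_model // num_heads slice of the projections;
    head outputs are concatenated and mixed by the output projection w_o.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project inputs, then split the feature dimension into heads.
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Per-head attention weights over the sequence.
    scores = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))
    # Concatenate heads back into the model dimension and project.
    out = (scores @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o

rng = np.random.default_rng(0)
seq_len, d_model, heads = 4, 8, 2  # toy sizes for illustration
x = rng.standard_normal((seq_len, d_model))
w = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4)]
y = multi_head_attention(x, *w, num_heads=heads)
print(y.shape)  # (4, 8)
```

In this standard design the heads compute their attention weights independently; the article's "interacting-head" strategy modifies how heads share information, which this sketch does not attempt to reproduce.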
Main authors: Li, Dongxing; Luo, Zuying
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Online access:
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9239798/
- https://www.ncbi.nlm.nih.gov/pubmed/35774445
- http://dx.doi.org/10.1155/2022/2998242
Similar items
- Beyond the Transformer: A Novel Polynomial Inherent Attention (PIA) Model and Its Great Impact on Neural Machine Translation
  by: ELAffendi, Mohammed, et al.
  Published: (2022)
- A Transformer-Based Neural Machine Translation Model for Arabic Dialects That Utilizes Subword Units
  by: Baniata, Laith H., et al.
  Published: (2021)
- Improving Neural Machine Translation for Low Resource Algerian Dialect by Transductive Transfer Learning Strategy
  by: Slim, Amel, et al.
  Published: (2022)
- Translating Akkadian to English with neural machine translation
  by: Gutherz, Gai, et al.
  Published: (2023)
- Improving Neural Machine Translation by Filtering Synthetic Parallel Data
  by: Xu, Guanghao, et al.
  Published: (2019)