Beyond the Transformer: A Novel Polynomial Inherent Attention (PIA) Model and Its Great Impact on Neural Machine Translation
This paper describes a novel polynomial inherent attention (PIA) model that outperforms all state-of-the-art transformer models on neural machine translation (NMT) by a wide margin. PIA is based on the simple idea that natural language sentences can be transformed into a special type of binary atten...
Main Authors: ELAffendi, Mohammed; Alrajhi, Khawlah
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9519290/
https://www.ncbi.nlm.nih.gov/pubmed/36188704
http://dx.doi.org/10.1155/2022/1912750
Similar Items
- An Improved Transformer-Based Neural Machine Translation Strategy: Interacting-Head Attention
  by: Li, Dongxing, et al.
  Published: (2022)
- The Translated Dowling Polynomials and Numbers
  by: Mangontarum, Mahid M., et al.
  Published: (2014)
- Polynomial, Neural Network, and Spline Wavelet Models for Continuous Wavelet Transform of Signals
  by: Stepanov, Andrey
  Published: (2021)
- Multiple Hilbert transforms associated with polynomials
  by: Kim, Joonil
  Published: (2015)
- D-transformation and polynomial track recognition
  by: Nazarenko, M A, et al.
  Published: (1996)