Heavyweight Statistical Alignment to Guide Neural Translation
Transformer neural models with multi-head attention outperform all existing translation models. Nevertheless, some features of traditional statistical models, such as prior alignment between source and target words, prove useful in training state-of-the-art Transformer models. It has been report...
Main Authors: Nguyen, Thien; Nguyen, Trang
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9187440/
https://www.ncbi.nlm.nih.gov/pubmed/35694597
http://dx.doi.org/10.1155/2022/6856567
Similar Items
- Fire Performance of Heavyweight Self-Compacting Concrete and Heavyweight High Strength Concrete
  by: Aslani, Farhad, et al.
  Published: (2019)
- Tau neutrino no heavyweight?
  Published: (1991)
- Development of Heavyweight Self-Compacting Concrete and Ambient-Cured Heavyweight Geopolymer Concrete Using Magnetite Aggregates
  by: Valizadeh, Afsaneh, et al.
  Published: (2019)
- Mixed-Level Neural Machine Translation
  by: Nguyen, Thien, et al.
  Published: (2020)
- Reflections on hexavalent chromium: health hazards of an industrial heavyweight.
  by: Pellerin, C, et al.
  Published: (2000)