
A Transformer-Based Hierarchical Variational AutoEncoder Combined Hidden Markov Model for Long Text Generation

The Variational AutoEncoder (VAE) has made significant progress in text generation, but it has focused on short texts (usually a single sentence). Long texts consist of multiple sentences, and there is a particular relationship between sentences, especially between the latent variables that control the generation...

Bibliographic Details
Main Authors: Zhao, Kun; Ding, Hongwei; Ye, Kai; Cui, Xiaohui
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8534582/
https://www.ncbi.nlm.nih.gov/pubmed/34682001
http://dx.doi.org/10.3390/e23101277