Natural Language Generation Using Transformer Network in an Open-Domain Setting
Main Authors: | Varshney, Deeksha; Ekbal, Asif; Nagaraja, Ganesh Prasad; Tiwari, Mrigank; Gopinath, Abhijith Athreya Mysore; Bhattacharyya, Pushpak |
Format: | Online Article Text |
Language: | English |
Published: | 2020 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7298179/ http://dx.doi.org/10.1007/978-3-030-51310-8_8 |
_version_ | 1783547163317370880 |
author | Varshney, Deeksha Ekbal, Asif Nagaraja, Ganesh Prasad Tiwari, Mrigank Gopinath, Abhijith Athreya Mysore Bhattacharyya, Pushpak |
author_facet | Varshney, Deeksha Ekbal, Asif Nagaraja, Ganesh Prasad Tiwari, Mrigank Gopinath, Abhijith Athreya Mysore Bhattacharyya, Pushpak |
author_sort | Varshney, Deeksha |
collection | PubMed |
description | Prior works on dialog generation focus on task-oriented settings and utilize multi-turn conversational utterance-response pairs. However, natural language generation (NLG) in an open-domain environment is more challenging. The conversations in an open-domain chit-chat model are mostly single-turn in nature. Current methods for modeling single-turn conversations often fail to generate contextually relevant responses on large datasets. In our work, we develop a transformer-based method for NLG in an open-domain setting. Experiments on the utterance-response pairs show improvement over the baselines, both in terms of quantitative measures such as BLEU and ROUGE and human evaluation metrics such as fluency and adequacy. |
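The BLEU metric mentioned in the abstract scores a generated response by how many of its n-grams also appear in a reference response. As a minimal, standard-library-only sketch of the idea, the function below computes clipped unigram precision (the BLEU-1 component, without the brevity penalty); the example sentences are invented for illustration and are not from the paper's data.

```python
from collections import Counter

def clipped_unigram_precision(hypothesis, reference):
    """BLEU-1 modified precision: each hypothesis token is credited only up
    to the number of times it occurs in the reference (count clipping)."""
    hyp_counts = Counter(hypothesis)
    ref_counts = Counter(reference)
    # Clip each hypothesis token's count by its count in the reference.
    overlap = sum(min(count, ref_counts[token])
                  for token, count in hyp_counts.items())
    return overlap / len(hypothesis)

hyp = "the cat sat on the mat".split()
ref = "the cat is on the mat".split()
print(round(clipped_unigram_precision(hyp, ref), 4))  # 5 of 6 tokens match -> 0.8333
```

Full BLEU combines such precisions for n-grams up to length 4 with a brevity penalty; in practice one would use an existing implementation such as NLTK's `sentence_bleu` rather than this sketch.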
format | Online Article Text |
id | pubmed-7298179 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
record_format | MEDLINE/PubMed |
spelling | pubmed-7298179 2020-06-17 Natural Language Generation Using Transformer Network in an Open-Domain Setting. Varshney, Deeksha; Ekbal, Asif; Nagaraja, Ganesh Prasad; Tiwari, Mrigank; Gopinath, Abhijith Athreya Mysore; Bhattacharyya, Pushpak. Natural Language Processing and Information Systems, Article. 2020-05-26 /pmc/articles/PMC7298179/ http://dx.doi.org/10.1007/978-3-030-51310-8_8 Text en © Springer Nature Switzerland AG 2020. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. |
spellingShingle | Article Varshney, Deeksha Ekbal, Asif Nagaraja, Ganesh Prasad Tiwari, Mrigank Gopinath, Abhijith Athreya Mysore Bhattacharyya, Pushpak Natural Language Generation Using Transformer Network in an Open-Domain Setting |
title | Natural Language Generation Using Transformer Network in an Open-Domain Setting |
title_full | Natural Language Generation Using Transformer Network in an Open-Domain Setting |
title_fullStr | Natural Language Generation Using Transformer Network in an Open-Domain Setting |
title_full_unstemmed | Natural Language Generation Using Transformer Network in an Open-Domain Setting |
title_short | Natural Language Generation Using Transformer Network in an Open-Domain Setting |
title_sort | natural language generation using transformer network in an open-domain setting |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7298179/ http://dx.doi.org/10.1007/978-3-030-51310-8_8 |