
Distant Supervision for Extractive Question Summarization

Questions are often lengthy and difficult to understand because they tend to contain peripheral information. Previous work relies on costly human-annotated data or question-title pairs. In this work, we propose a distant supervision framework that can train a question summarizer without annotation costs or question-title pairs, where sentences are automatically annotated by means of heuristic rules. The key idea is that a single-sentence question tends to have a summary-like property. We empirically show that our models trained on the framework perform competitively with respect to supervised models without the requirement of a costly human-annotated dataset.


Bibliographic Details
Main Authors: Ishigaki, Tatsuya, Machida, Kazuya, Kobayashi, Hayato, Takamura, Hiroya, Okumura, Manabu
Format: Online Article Text
Language: English
Published: 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7148018/
http://dx.doi.org/10.1007/978-3-030-45442-5_23
author Ishigaki, Tatsuya; Machida, Kazuya; Kobayashi, Hayato; Takamura, Hiroya; Okumura, Manabu
collection PubMed
description Questions are often lengthy and difficult to understand because they tend to contain peripheral information. Previous work relies on costly human-annotated data or question-title pairs. In this work, we propose a distant supervision framework that can train a question summarizer without annotation costs or question-title pairs, where sentences are automatically annotated by means of heuristic rules. The key idea is that a single-sentence question tends to have a summary-like property. We empirically show that our models trained on the framework perform competitively with respect to supervised models without the requirement of a costly human-annotated dataset.
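The distant-supervision idea in the description above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the sentence splitter and the exact labeling rule are assumptions, standing in for the paper's heuristic rules. It encodes the stated key idea that a single-sentence question tends to be summary-like, so such questions can serve as positive pseudo-labels at no annotation cost.

```python
import re

def split_sentences(text: str) -> list[str]:
    # Naive sentence splitter on terminal punctuation (an assumption;
    # any off-the-shelf sentence tokenizer could be used instead).
    return [s.strip() for s in re.split(r"(?<=[.?!])\s+", text.strip()) if s.strip()]

def distant_labels(questions: list[str]) -> list[tuple[str, int]]:
    """Build pseudo-labeled (sentence, label) pairs with no human annotation.

    Heuristic (illustrative assumption): a single-sentence question is
    treated as a summary-like positive example (label 1); sentences drawn
    from multi-sentence questions serve as negatives (label 0), since many
    of them carry peripheral detail rather than the core question.
    """
    examples = []
    for q in questions:
        sents = split_sentences(q)
        if len(sents) == 1:
            examples.append((sents[0], 1))   # summary-like
        else:
            examples.extend((s, 0) for s in sents)  # likely peripheral
    return examples
```

Pairs produced this way could then train an ordinary extractive sentence classifier, which is what makes the scheme "distant": the supervision signal comes from the heuristic, not from annotators or question-title pairs.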
format Online Article Text
id pubmed-7148018
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-7148018 2020-04-13 Distant Supervision for Extractive Question Summarization. Advances in Information Retrieval, Article. Published 2020-03-24. /pmc/articles/PMC7148018/ http://dx.doi.org/10.1007/978-3-030-45442-5_23 Text en © Springer Nature Switzerland AG 2020. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.
title Distant Supervision for Extractive Question Summarization
topic Article