
Semantic Entropy in Language Comprehension

Bibliographic Details

Main Authors: Venhuizen, Noortje J., Crocker, Matthew W., Brouwer, Harm
Format: Online Article Text
Language: English
Published: MDPI 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514504/
http://dx.doi.org/10.3390/e21121159
Description:
Language is processed on a more or less word-by-word basis, and the processing difficulty induced by each word is affected by our prior linguistic experience as well as our general knowledge about the world. Surprisal and entropy reduction have been independently proposed as linking theories between word processing difficulty and probabilistic language models. Extant models, however, are typically limited to capturing linguistic experience and hence cannot account for the influence of world knowledge. A recent comprehension model by Venhuizen, Crocker, and Brouwer (2019, Discourse Processes) improves upon this situation by instantiating a comprehension-centric metric of surprisal that integrates linguistic experience and world knowledge at the level of interpretation and combines them in determining online expectations. Here, we extend this work by deriving a comprehension-centric metric of entropy reduction from this model. In contrast to previous work, which has found that surprisal and entropy reduction are not easily dissociated, we do find a clear dissociation in our model. While both surprisal and entropy reduction derive from the same cognitive process—the word-by-word updating of the unfolding interpretation—they reflect different aspects of this process: state-by-state expectation (surprisal) versus end-state confirmation (entropy reduction).
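The two linking metrics named in the abstract have standard information-theoretic definitions: the surprisal of a word is the negative log probability of that word given its context, and entropy reduction is the drop in uncertainty over possible outcomes caused by processing the word. The following is a minimal illustrative sketch of those textbook quantities only; the toy probabilities and the framing of the distribution as being over "candidate interpretations" are assumptions for illustration, not the authors' actual model or data.

```python
import math

def surprisal(p_word: float) -> float:
    """Surprisal of a word in bits: -log2 P(word | context)."""
    return -math.log2(p_word)

def entropy(dist: dict) -> float:
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical distribution over three candidate end-state interpretations,
# before and after processing a word.
before = {"A": 0.5, "B": 0.3, "C": 0.2}
after = {"A": 0.8, "B": 0.1, "C": 0.1}

# Entropy reduction: how much end-state uncertainty the word removed.
delta_h = entropy(before) - entropy(after)

print(surprisal(0.25))   # a word with P = 0.25 carries 2.0 bits of surprisal
print(round(delta_h, 3))
```

A positive `delta_h` means the word narrowed down the space of end states, which is the "end-state confirmation" aspect the abstract contrasts with the "state-by-state expectation" captured by surprisal.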
ID: pubmed-7514504
Institution: National Center for Biotechnology Information
Record format: MEDLINE/PubMed
Journal: Entropy (Basel)
Published online: 2019-11-27
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).