
Approximate Entropy in Canonical and Non-Canonical Fiction

Computational textual aesthetics aims at studying observable differences between aesthetic categories of text. We use Approximate Entropy to measure the (un)predictability in two aesthetic text categories, i.e., canonical fiction (‘classics’) and non-canonical fiction (with lower prestige). Approximate Entropy is determined for series derived from sentence-length values and the distribution of part-of-speech-tags in windows of texts. For comparison, we also include a sample of non-fictional texts. Moreover, we use Shannon Entropy to estimate degrees of (un)predictability due to frequency distributions in the entire text. Our results show that the Approximate Entropy values can better differentiate canonical from non-canonical texts compared with Shannon Entropy, which is not true for the classification of fictional vs. expository prose. Canonical and non-canonical texts thus differ in sequential structure, while inter-genre differences are a matter of the overall distribution of local frequencies. We conclude that canonical fictional texts exhibit a higher degree of (sequential) unpredictability compared with non-canonical texts, corresponding to the popular assumption that they are more ‘demanding’ and ‘richer’. In using Approximate Entropy, we propose a new method for text classification in the context of computational textual aesthetics.


Bibliographic Details
Main Authors: Mohseni, Mahdi; Redies, Christoph; Gast, Volker
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8870941/
https://www.ncbi.nlm.nih.gov/pubmed/35205572
http://dx.doi.org/10.3390/e24020278
author Mohseni, Mahdi
Redies, Christoph
Gast, Volker
collection PubMed
description Computational textual aesthetics aims at studying observable differences between aesthetic categories of text. We use Approximate Entropy to measure the (un)predictability in two aesthetic text categories, i.e., canonical fiction (‘classics’) and non-canonical fiction (with lower prestige). Approximate Entropy is determined for series derived from sentence-length values and the distribution of part-of-speech-tags in windows of texts. For comparison, we also include a sample of non-fictional texts. Moreover, we use Shannon Entropy to estimate degrees of (un)predictability due to frequency distributions in the entire text. Our results show that the Approximate Entropy values can better differentiate canonical from non-canonical texts compared with Shannon Entropy, which is not true for the classification of fictional vs. expository prose. Canonical and non-canonical texts thus differ in sequential structure, while inter-genre differences are a matter of the overall distribution of local frequencies. We conclude that canonical fictional texts exhibit a higher degree of (sequential) unpredictability compared with non-canonical texts, corresponding to the popular assumption that they are more ‘demanding’ and ‘richer’. In using Approximate Entropy, we propose a new method for text classification in the context of computational textual aesthetics.
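The description above contrasts two measures: Approximate Entropy over ordered series (e.g., sentence lengths) and Shannon Entropy over the text-wide value distribution. As a minimal, hedged sketch of that contrast — standard Pincus-style ApEn and plain Shannon entropy, not the authors' exact pipeline; the embedding dimension, tolerance heuristic, and toy sentence-length series below are illustrative assumptions:

```python
import math
import random
from collections import Counter

def approximate_entropy(series, m=2, r=None):
    """Pincus-style Approximate Entropy of a numeric series.

    m: embedding dimension (length of the compared subsequences).
    r: similarity tolerance; defaults to 0.2 * the series' standard
       deviation (a common heuristic, assumed here, not necessarily
       the paper's setting).
    """
    n = len(series)
    if r is None:
        mean = sum(series) / n
        r = 0.2 * math.sqrt(sum((x - mean) ** 2 for x in series) / n)

    def phi(k):
        # All overlapping windows of length k.
        windows = [series[i:i + k] for i in range(n - k + 1)]
        total = 0.0
        for w in windows:
            # Fraction of windows within Chebyshev distance r of w;
            # every window matches itself, so the count is never zero.
            matches = sum(
                1 for v in windows
                if max(abs(a - b) for a, b in zip(w, v)) <= r
            )
            total += math.log(matches / len(windows))
        return total / len(windows)

    return phi(m) - phi(m + 1)

def shannon_entropy(series):
    """Shannon entropy of the value distribution, ignoring order."""
    counts = Counter(series)
    n = len(series)
    return -sum(c / n * math.log(c / n) for c in counts.values())

# A strictly alternating sentence-length series vs. a shuffled copy:
# identical value distribution (so identical Shannon entropy), but the
# shuffle destroys the sequential predictability that ApEn detects.
regular = [10, 25] * 30
irregular = regular.copy()
random.seed(42)
random.shuffle(irregular)

print(approximate_entropy(regular))    # near 0: fully predictable order
print(approximate_entropy(irregular))  # clearly positive
print(shannon_entropy(regular) == shannon_entropy(irregular))  # True
```

This mirrors the paper's point: Shannon entropy is blind to word/sentence order, so only a sequence-sensitive measure like ApEn can separate texts that differ in sequential structure but not in overall frequency distributions.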
format Online
Article
Text
id pubmed-8870941
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8870941 2022-02-25 Approximate Entropy in Canonical and Non-Canonical Fiction. Mohseni, Mahdi; Redies, Christoph; Gast, Volker. Entropy (Basel), Article. MDPI 2022-02-15 /pmc/articles/PMC8870941/ /pubmed/35205572 http://dx.doi.org/10.3390/e24020278 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Approximate Entropy in Canonical and Non-Canonical Fiction
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8870941/
https://www.ncbi.nlm.nih.gov/pubmed/35205572
http://dx.doi.org/10.3390/e24020278