Entropy, Economics, and Criticality

Bibliographic Details

Main Author: Harré, Michael S.
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8871333/
https://www.ncbi.nlm.nih.gov/pubmed/35205504
http://dx.doi.org/10.3390/e24020210
collection PubMed
description Information theory is a well-established method for the study of many phenomena, and more than 70 years after Claude Shannon first described it in A Mathematical Theory of Communication it has been extended well beyond Shannon's initial vision. It is now an interdisciplinary tool used for everything from 'causal' information flow to inferring complex computational processes, and it commonly plays an important role in fields as diverse as neuroscience, artificial intelligence, quantum mechanics, and astrophysics. In this article, I provide a selective review of an aspect of information theory that has received less attention than many others: its use as a tool for understanding, modelling, and detecting non-linear phenomena in finance and economics. Although some progress has been made, this remains an under-developed area that I argue has considerable scope for further development.
format Online Article Text
id pubmed-8871333
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-8871333 2022-02-25 Entropy, Economics, and Criticality. Harré, Michael S. Entropy (Basel), Opinion. MDPI, published 2022-01-28. Text en. © 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
topic Opinion