
Does Machine Understanding Require Consciousness?


Bibliographic Details
Main Author: Pepperell, Robert
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9159796/
https://www.ncbi.nlm.nih.gov/pubmed/35664685
http://dx.doi.org/10.3389/fnsys.2022.788486
_version_ 1784719132894167040
author Pepperell, Robert
author_facet Pepperell, Robert
author_sort Pepperell, Robert
collection PubMed
description This article addresses the question of whether machine understanding requires consciousness. Some researchers in the field of machine understanding have argued that it is not necessary for computers to be conscious as long as they can match or exceed human performance in certain tasks. But despite the remarkable recent success of machine learning systems in areas such as natural language processing and image classification, important questions remain about their limited performance and about whether their cognitive abilities entail genuine understanding or are the product of spurious correlations. Here I draw a distinction between natural, artificial, and machine understanding. I analyse some concrete examples of natural understanding and show that although it shares properties with the artificial understanding implemented in current machine learning systems, it also has some essential differences, the main one being that natural understanding in humans entails consciousness. Moreover, evidence from psychology and neurobiology suggests that it is this capacity for consciousness that, in part at least, explains the superior performance of humans in some cognitive tasks and may also account for the authenticity of semantic processing that seems to be the hallmark of natural understanding. I propose a hypothesis that might help to explain why consciousness is important to understanding. In closing, I suggest that progress toward implementing human-like understanding in machines—machine understanding—may benefit from a naturalistic approach in which natural processes are modelled as closely as possible in mechanical substrates.
format Online
Article
Text
id pubmed-9159796
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9159796 2022-06-02 Does Machine Understanding Require Consciousness? Pepperell, Robert Front Syst Neurosci Neuroscience This article addresses the question of whether machine understanding requires consciousness. Some researchers in the field of machine understanding have argued that it is not necessary for computers to be conscious as long as they can match or exceed human performance in certain tasks. But despite the remarkable recent success of machine learning systems in areas such as natural language processing and image classification, important questions remain about their limited performance and about whether their cognitive abilities entail genuine understanding or are the product of spurious correlations. Here I draw a distinction between natural, artificial, and machine understanding. I analyse some concrete examples of natural understanding and show that although it shares properties with the artificial understanding implemented in current machine learning systems, it also has some essential differences, the main one being that natural understanding in humans entails consciousness. Moreover, evidence from psychology and neurobiology suggests that it is this capacity for consciousness that, in part at least, explains the superior performance of humans in some cognitive tasks and may also account for the authenticity of semantic processing that seems to be the hallmark of natural understanding. I propose a hypothesis that might help to explain why consciousness is important to understanding. In closing, I suggest that progress toward implementing human-like understanding in machines—machine understanding—may benefit from a naturalistic approach in which natural processes are modelled as closely as possible in mechanical substrates. Frontiers Media S.A. 2022-05-18 /pmc/articles/PMC9159796/ /pubmed/35664685 http://dx.doi.org/10.3389/fnsys.2022.788486 Text en Copyright © 2022 Pepperell. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Pepperell, Robert
Does Machine Understanding Require Consciousness?
title Does Machine Understanding Require Consciousness?
title_full Does Machine Understanding Require Consciousness?
title_fullStr Does Machine Understanding Require Consciousness?
title_full_unstemmed Does Machine Understanding Require Consciousness?
title_short Does Machine Understanding Require Consciousness?
title_sort does machine understanding require consciousness?
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9159796/
https://www.ncbi.nlm.nih.gov/pubmed/35664685
http://dx.doi.org/10.3389/fnsys.2022.788486
work_keys_str_mv AT pepperellrobert doesmachineunderstandingrequireconsciousness