
Computational modeling of human reasoning processes for interpretable visual knowledge: a case study with radiographers

Visual reasoning is critical in many complex visual tasks in medicine such as radiology or pathology. It is challenging to explicitly explain reasoning processes due to the dynamic nature of real-time human cognition. A deeper understanding of such reasoning processes is necessary for improving diagnostic accuracy and computational tools. Most computational analysis methods for visual attention utilize black-box algorithms which lack explainability and are therefore limited in understanding the visual reasoning processes. In this paper, we propose a computational method to quantify and dissect visual reasoning. The method characterizes spatial and temporal features and identifies common and contrast visual reasoning patterns to extract significant gaze activities. The visual reasoning patterns are explainable and can be compared among different groups to discover strategy differences. Experiments with radiographers of varied levels of expertise on 10 levels of visual tasks were conducted. Our empirical observations show that the method can capture the temporal and spatial features of human visual attention and distinguish expertise level. The extracted patterns are further examined and interpreted to showcase key differences between expertise levels in the visual reasoning processes. By revealing task-related reasoning processes, this method demonstrates potential for explaining human visual understanding.
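The abstract mentions characterizing spatial and temporal features of gaze activity. As a rough illustration only, the sketch below computes generic eye-tracking summaries (total viewing time, step length between successive gaze points, spatial spread) from a raw gaze trace; these measure names and the helper gaze_features are assumptions for demonstration, not the authors' published method.

import numpy as np

def gaze_features(timestamps, xs, ys):
    """Summarize a gaze trace with simple temporal and spatial statistics.

    Hypothetical helper: the measures below are generic eye-tracking
    summaries, not the features defined in the paper.
    """
    t = np.asarray(timestamps, dtype=float)
    x = np.asarray(xs, dtype=float)
    y = np.asarray(ys, dtype=float)

    # Temporal: total viewing time and mean interval between samples.
    total_time = t[-1] - t[0]
    mean_interval = np.diff(t).mean()

    # Spatial: distances between successive gaze points (saccade-like moves).
    steps = np.hypot(np.diff(x), np.diff(y))

    return {
        "total_time": total_time,
        "mean_sample_interval": mean_interval,
        "mean_step_length": steps.mean(),
        "spatial_spread": float(x.std() + y.std()),
    }

# Example with a short synthetic trace (seconds, pixel coordinates).
print(gaze_features([0.0, 0.1, 0.2, 0.3], [100, 120, 300, 310], [200, 210, 205, 400]))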

Bibliographic Details
Main Authors: Li, Yu; Cao, Hongfei; Allen, Carla M.; Wang, Xin; Erdelez, Sanda; Shyu, Chi-Ren
Format: Online Article Text
Language: English
Published in: Sci Rep, Nature Publishing Group UK, 2020-12-10
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7730148/
https://www.ncbi.nlm.nih.gov/pubmed/33303770
http://dx.doi.org/10.1038/s41598-020-77550-9
License: © The Author(s) 2020. Open access under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).