Modeling the effects of perisaccadic attention on gaze statistics during scene viewing

Bibliographic Details
Main Authors: Schwetlick, Lisa, Rothkegel, Lars Oliver Martin, Trukenbrod, Hans Arne, Engbert, Ralf
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7708631/
https://www.ncbi.nlm.nih.gov/pubmed/33262536
http://dx.doi.org/10.1038/s42003-020-01429-8
author Schwetlick, Lisa
Rothkegel, Lars Oliver Martin
Trukenbrod, Hans Arne
Engbert, Ralf
collection PubMed
description How we perceive a visual scene depends critically on the selection of gaze positions. For this selection process, visual attention is known to play a key role in two ways. First, image features attract visual attention, a fact that is captured well by time-independent fixation models. Second, millisecond-level attentional dynamics around the time of a saccade drive our gaze from one position to the next. These two related research areas on attention are typically treated as separate, both theoretically and experimentally. Here we link the two research areas by demonstrating that perisaccadic attentional dynamics improve predictions of scan path statistics. In a mathematical model, we integrated perisaccadic covert attention with dynamic scan path generation. Our model reproduces saccade amplitude distributions, angular statistics, intersaccadic turning angles, and their impact on fixation durations, as well as inter-individual differences, using Bayesian inference. Therefore, our results lend support to the relevance of perisaccadic attention to gaze statistics.
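Note: the abstract refers to scan path statistics such as saccade amplitude distributions and intersaccadic turning angles. As a purely illustrative aside (not the authors' model), the following minimal sketch shows how these two quantities can be computed from a sequence of fixation coordinates, assuming positions are given as (x, y) pairs in degrees of visual angle; all names here are hypothetical.

# Hypothetical illustration (not taken from the article): saccade amplitudes and
# intersaccadic turning angles from a sequence of fixation positions.
import numpy as np

def scanpath_statistics(fixations):
    """fixations: (N, 2) array-like of gaze positions in degrees of visual angle."""
    fixations = np.asarray(fixations, dtype=float)
    saccades = np.diff(fixations, axis=0)            # vectors between successive fixations
    amplitudes = np.linalg.norm(saccades, axis=1)    # saccade amplitude distribution
    directions = np.arctan2(saccades[:, 1], saccades[:, 0])
    # Turning angle between consecutive saccades, wrapped to [-180, 180) degrees.
    turning = np.degrees(np.diff(directions))
    turning = (turning + 180.0) % 360.0 - 180.0
    return amplitudes, turning

# Usage example: a short scan path ending with a near-return saccade.
amps, turns = scanpath_statistics([(0, 0), (5, 0), (5, 3), (0.5, 3.2)])
print(amps, turns)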
format Online
Article
Text
id pubmed-7708631
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-7708631 2020-12-03 Commun Biol Article Nature Publishing Group UK 2020-12-01 /pmc/articles/PMC7708631/ /pubmed/33262536 http://dx.doi.org/10.1038/s42003-020-01429-8 Text en © The Author(s) 2020. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
title Modeling the effects of perisaccadic attention on gaze statistics during scene viewing
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7708631/
https://www.ncbi.nlm.nih.gov/pubmed/33262536
http://dx.doi.org/10.1038/s42003-020-01429-8