
MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication

The analysis of dynamic scenes has been a challenging domain in eye tracking research. This study presents a framework, named MAGiC, for analyzing gaze contact and gaze aversion in face-to-face communication. MAGiC provides an environment that is able to detect and track the conversation partner’s face automatically, overlay gaze data on top of the face video, and incorporate speech by means of speech-act annotation. Specifically, MAGiC integrates eye tracking data for gaze, audio data for speech segmentation, and video data for face tracking. MAGiC is an open source framework and its usage is demonstrated via publicly available video content and wiki pages. We explored the capabilities of MAGiC through a pilot study and showed that it facilitates the analysis of dynamic gaze data by reducing the annotation effort and the time spent for manual analysis of video data.
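
The abstract above describes MAGiC's core pipeline: detect and track the partner's face in the scene video, overlay the eye tracker's gaze samples on it, and treat samples on versus off the face as gaze contact versus gaze aversion. The Python sketch below illustrates that idea only in outline; it is not MAGiC's code or API, and the file names, the CSV layout of the gaze samples, and the use of OpenCV's Haar cascade face detector are assumptions made for the example.

# Minimal, hypothetical sketch of the gaze-on-face overlay idea (not MAGiC's actual implementation).
import csv
import cv2

# Assumed inputs: a scene-camera video of the partner and a CSV of gaze samples,
# one row per video frame, with pixel coordinates in columns "x" and "y".
VIDEO_PATH = "partner_scene.avi"   # hypothetical file name
GAZE_CSV = "gaze_samples.csv"      # hypothetical file name

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

with open(GAZE_CSV, newline="") as f:
    gaze = [(float(r["x"]), float(r["y"])) for r in csv.DictReader(f)]

cap = cv2.VideoCapture(VIDEO_PATH)
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok or frame_idx >= len(gaze):
        break
    gx, gy = gaze[frame_idx]

    # Detect the partner's face and check whether the gaze sample lands on it
    # (a crude stand-in for the contact / aversion distinction).
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    on_face = any(x <= gx <= x + w and y <= gy <= y + h for (x, y, w, h) in faces)

    # Overlay the face box and the gaze point on the frame.
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
    cv2.circle(frame, (int(gx), int(gy)), 8,
               (0, 255, 0) if on_face else (0, 0, 255), -1)
    print(frame_idx, "contact" if on_face else "aversion")
    frame_idx += 1
cap.release()

A production tool such as MAGiC would additionally synchronize audio for speech-act segmentation and use a more robust face tracker; this sketch only shows how the per-frame gaze overlay and the on-face test fit together.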


Bibliographic Details
Main Authors: Arslan Aydın, Ülkü, Kalkan, Sinan, Acartürk, Cengiz
Format: Online Article Text
Language: English
Published: Bern Open Publishing 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7906569/
https://www.ncbi.nlm.nih.gov/pubmed/33828712
http://dx.doi.org/10.16910/jemr.11.6.2
_version_ 1783655317160067072
author Arslan Aydın, Ülkü
Kalkan, Sinan
Acartürk, Cengiz
author_facet Arslan Aydın, Ülkü
Kalkan, Sinan
Acartürk, Cengiz
author_sort Arslan Aydın, Ülkü
collection PubMed
description The analysis of dynamic scenes has been a challenging domain in eye tracking research. This study presents a framework, named MAGiC, for analyzing gaze contact and gaze aversion in face-to-face communication. MAGiC provides an environment that is able to detect and track the conversation partner’s face automatically, overlay gaze data on top of the face video, and incorporate speech by means of speech-act annotation. Specifically, MAGiC integrates eye tracking data for gaze, audio data for speech segmentation, and video data for face tracking. MAGiC is an open source framework and its usage is demonstrated via publicly available video content and wiki pages. We explored the capabilities of MAGiC through a pilot study and showed that it facilitates the analysis of dynamic gaze data by reducing the annotation effort and the time spent for manual analysis of video data.
format Online
Article
Text
id pubmed-7906569
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Bern Open Publishing
record_format MEDLINE/PubMed
spelling pubmed-7906569 2021-04-06 MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication Arslan Aydın, Ülkü Kalkan, Sinan Acartürk, Cengiz J Eye Mov Res Research Article The analysis of dynamic scenes has been a challenging domain in eye tracking research. This study presents a framework, named MAGiC, for analyzing gaze contact and gaze aversion in face-to-face communication. MAGiC provides an environment that is able to detect and track the conversation partner’s face automatically, overlay gaze data on top of the face video, and incorporate speech by means of speech-act annotation. Specifically, MAGiC integrates eye tracking data for gaze, audio data for speech segmentation, and video data for face tracking. MAGiC is an open source framework and its usage is demonstrated via publicly available video content and wiki pages. We explored the capabilities of MAGiC through a pilot study and showed that it facilitates the analysis of dynamic gaze data by reducing the annotation effort and the time spent for manual analysis of video data. Bern Open Publishing 2018-11-12 /pmc/articles/PMC7906569/ /pubmed/33828712 http://dx.doi.org/10.16910/jemr.11.6.2 Text en This work is licensed under a Creative Commons Attribution 4.0 International License, ( https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use and redistribution provided that the original author and source are credited.
spellingShingle Research Article
Arslan Aydın, Ülkü
Kalkan, Sinan
Acartürk, Cengiz
MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication
title MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication
title_full MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication
title_fullStr MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication
title_full_unstemmed MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication
title_short MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication
title_sort magic: a multimodal framework for analysing gaze in dyadic communication
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7906569/
https://www.ncbi.nlm.nih.gov/pubmed/33828712
http://dx.doi.org/10.16910/jemr.11.6.2
work_keys_str_mv AT arslanaydınulku magicamultimodalframeworkforanalysinggazeindyadiccommunication
AT kalkansinan magicamultimodalframeworkforanalysinggazeindyadiccommunication
AT acarturkcengiz magicamultimodalframeworkforanalysinggazeindyadiccommunication