
DisCaaS: Micro Behavior Analysis on Discussion by Camera as a Sensor


Bibliographic Details
Main Authors: Watanabe, Ko, Soneda, Yusuke, Matsuda, Yuki, Nakamura, Yugo, Arakawa, Yutaka, Dengel, Andreas, Ishimaru, Shoya
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8434061/
https://www.ncbi.nlm.nih.gov/pubmed/34502609
http://dx.doi.org/10.3390/s21175719
_version_ 1783751509214756864
author Watanabe, Ko
Soneda, Yusuke
Matsuda, Yuki
Nakamura, Yugo
Arakawa, Yutaka
Dengel, Andreas
Ishimaru, Shoya
author_facet Watanabe, Ko
Soneda, Yusuke
Matsuda, Yuki
Nakamura, Yugo
Arakawa, Yutaka
Dengel, Andreas
Ishimaru, Shoya
author_sort Watanabe, Ko
collection PubMed
description The emergence of various types of commercial cameras (compact, high-resolution, high-angle-of-view, high-speed, high-dynamic-range, etc.) has contributed significantly to the understanding of human activities. Taking advantage of a high angle of view, this paper demonstrates a system that recognizes micro-behaviors in a small group discussion with a single 360-degree camera, toward quantified meeting analysis. We propose a method that recognizes speaking and nodding, which have often been overlooked in existing research, from a video stream of face images using a random forest classifier. The proposed approach was evaluated on our three datasets. To create the first and second datasets, we asked participants to meet physically: 16 sets of five-minute meeting data from 21 unique participants and seven sets of 10-minute meeting data from 12 unique participants. The experimental results showed that our approach could detect speaking and nodding with a macro-average F1-score of 67.9% in a 10-fold random-split cross-validation and a macro-average F1-score of 62.5% in a leave-one-participant-out cross-validation. Considering the increased demand for online meetings due to the COVID-19 pandemic, we also recorded on-screen faces captured by web cameras as the third dataset and discuss the potential and challenges of applying our ideas to virtual video conferences.
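As a rough illustration of the evaluation protocol described above, the sketch below trains a random forest on face-derived features and scores it with a leave-one-participant-out split. The feature layout, sizes, and data are hypothetical placeholders, not values or code from the paper.

```python
# Minimal sketch, assuming per-frame face-derived features (e.g., head pose,
# mouth opening) classified into {other, speaking, nodding} by a random
# forest, evaluated with a leave-one-participant-out split and macro F1.
# All data below is randomly generated placeholder material.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_frames, n_features = 3000, 12                 # placeholder sizes
X = rng.normal(size=(n_frames, n_features))     # face-derived features (placeholder)
y = rng.integers(0, 3, size=n_frames)           # 0 = other, 1 = speaking, 2 = nodding
groups = rng.integers(0, 21, size=n_frames)     # participant ID per frame, for LOPO folds

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, groups=groups,
                         cv=LeaveOneGroupOut(), scoring="f1_macro")
print(f"macro F1 (leave-one-participant-out): {scores.mean():.3f}")
```

A 10-fold random-split evaluation, as also reported in the abstract, would swap `LeaveOneGroupOut()` for `KFold(n_splits=10, shuffle=True)` and drop the `groups` argument.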
format Online
Article
Text
id pubmed-8434061
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8434061 2021-09-12 DisCaaS: Micro Behavior Analysis on Discussion by Camera as a Sensor Watanabe, Ko Soneda, Yusuke Matsuda, Yuki Nakamura, Yugo Arakawa, Yutaka Dengel, Andreas Ishimaru, Shoya Sensors (Basel) Article The emergence of various types of commercial cameras (compact, high-resolution, high-angle-of-view, high-speed, high-dynamic-range, etc.) has contributed significantly to the understanding of human activities. Taking advantage of a high angle of view, this paper demonstrates a system that recognizes micro-behaviors in a small group discussion with a single 360-degree camera, toward quantified meeting analysis. We propose a method that recognizes speaking and nodding, which have often been overlooked in existing research, from a video stream of face images using a random forest classifier. The proposed approach was evaluated on our three datasets. To create the first and second datasets, we asked participants to meet physically: 16 sets of five-minute meeting data from 21 unique participants and seven sets of 10-minute meeting data from 12 unique participants. The experimental results showed that our approach could detect speaking and nodding with a macro-average F1-score of 67.9% in a 10-fold random-split cross-validation and a macro-average F1-score of 62.5% in a leave-one-participant-out cross-validation. Considering the increased demand for online meetings due to the COVID-19 pandemic, we also recorded on-screen faces captured by web cameras as the third dataset and discuss the potential and challenges of applying our ideas to virtual video conferences. MDPI 2021-08-25 /pmc/articles/PMC8434061/ /pubmed/34502609 http://dx.doi.org/10.3390/s21175719 Text en © 2021 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Watanabe, Ko
Soneda, Yusuke
Matsuda, Yuki
Nakamura, Yugo
Arakawa, Yutaka
Dengel, Andreas
Ishimaru, Shoya
DisCaaS: Micro Behavior Analysis on Discussion by Camera as a Sensor
title DisCaaS: Micro Behavior Analysis on Discussion by Camera as a Sensor
title_full DisCaaS: Micro Behavior Analysis on Discussion by Camera as a Sensor
title_fullStr DisCaaS: Micro Behavior Analysis on Discussion by Camera as a Sensor
title_full_unstemmed DisCaaS: Micro Behavior Analysis on Discussion by Camera as a Sensor
title_short DisCaaS: Micro Behavior Analysis on Discussion by Camera as a Sensor
title_sort discaas: micro behavior analysis on discussion by camera as a sensor
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8434061/
https://www.ncbi.nlm.nih.gov/pubmed/34502609
http://dx.doi.org/10.3390/s21175719
work_keys_str_mv AT watanabeko discaasmicrobehavioranalysisondiscussionbycameraasasensor
AT sonedayusuke discaasmicrobehavioranalysisondiscussionbycameraasasensor
AT matsudayuki discaasmicrobehavioranalysisondiscussionbycameraasasensor
AT nakamurayugo discaasmicrobehavioranalysisondiscussionbycameraasasensor
AT arakawayutaka discaasmicrobehavioranalysisondiscussionbycameraasasensor
AT dengelandreas discaasmicrobehavioranalysisondiscussionbycameraasasensor
AT ishimarushoya discaasmicrobehavioranalysisondiscussionbycameraasasensor