A multimodal dataset for various forms of distracted driving

We describe a multimodal dataset acquired in a controlled experiment on a driving simulator. The set includes data for n=68 volunteers that drove the same highway under four different conditions: No distraction, cognitive distraction, emotional distraction, and sensorimotor distraction. The experiment closed with a special driving session, where all subjects experienced a startle stimulus in the form of unintended acceleration—half of them under a mixed distraction, and the other half in the absence of a distraction. During the experimental drives key response variables and several explanatory variables were continuously recorded. The response variables included speed, acceleration, brake force, steering, and lane position signals, while the explanatory variables included perinasal electrodermal activity (EDA), palm EDA, heart rate, breathing rate, and facial expression signals; biographical and psychometric covariates as well as eye tracking data were also obtained. This dataset enables research into driving behaviors under neatly abstracted distracting stressors, which account for many car crashes. The set can also be used in physiological channel benchmarking and multispectral face recognition.

Bibliographic Details
Main Authors: Taamneh, Salah, Tsiamyrtzis, Panagiotis, Dcosta, Malcolm, Buddharaju, Pradeep, Khatri, Ashik, Manser, Michael, Ferris, Thomas, Wunderlich, Robert, Pavlidis, Ioannis
Format: Online Article Text
Language: English
Published: Nature Publishing Group 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5827115/
https://www.ncbi.nlm.nih.gov/pubmed/28809848
http://dx.doi.org/10.1038/sdata.2017.110
_version_ 1783302432822919168
author Taamneh, Salah
Tsiamyrtzis, Panagiotis
Dcosta, Malcolm
Buddharaju, Pradeep
Khatri, Ashik
Manser, Michael
Ferris, Thomas
Wunderlich, Robert
Pavlidis, Ioannis
author_facet Taamneh, Salah
Tsiamyrtzis, Panagiotis
Dcosta, Malcolm
Buddharaju, Pradeep
Khatri, Ashik
Manser, Michael
Ferris, Thomas
Wunderlich, Robert
Pavlidis, Ioannis
author_sort Taamneh, Salah
collection PubMed
description We describe a multimodal dataset acquired in a controlled experiment on a driving simulator. The set includes data for n=68 volunteers that drove the same highway under four different conditions: No distraction, cognitive distraction, emotional distraction, and sensorimotor distraction. The experiment closed with a special driving session, where all subjects experienced a startle stimulus in the form of unintended acceleration—half of them under a mixed distraction, and the other half in the absence of a distraction. During the experimental drives key response variables and several explanatory variables were continuously recorded. The response variables included speed, acceleration, brake force, steering, and lane position signals, while the explanatory variables included perinasal electrodermal activity (EDA), palm EDA, heart rate, breathing rate, and facial expression signals; biographical and psychometric covariates as well as eye tracking data were also obtained. This dataset enables research into driving behaviors under neatly abstracted distracting stressors, which account for many car crashes. The set can also be used in physiological channel benchmarking and multispectral face recognition.
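
The description field above effectively outlines the dataset's structure: one drive per subject per condition, with response signals (speed, acceleration, brake force, steering, lane position) and explanatory signals (perinasal EDA, palm EDA, heart rate, breathing rate, facial expression) recorded continuously. As a minimal sketch of how such data could be organized for analysis in Python, assuming a per-subject CSV layout, path scheme, and column names that are illustrative only and not specified in this record:

# Hypothetical sketch only: file paths, naming scheme, and column labels below
# are assumptions for illustration; they are not specified in this record.
import pandas as pd

# Signals named in the data descriptor's abstract.
RESPONSE_SIGNALS = ["speed", "acceleration", "brake_force", "steering", "lane_position"]
EXPLANATORY_SIGNALS = ["perinasal_eda", "palm_eda", "heart_rate", "breathing_rate"]
CONDITIONS = ["no_distraction", "cognitive", "emotional", "sensorimotor"]

def load_drive(subject_id: int, condition: str) -> pd.DataFrame:
    """Load one subject's drive under one condition from an assumed CSV layout."""
    if condition not in CONDITIONS:
        raise ValueError(f"unknown condition: {condition!r}")
    path = f"data/subject_{subject_id:02d}/{condition}.csv"  # assumed path scheme
    frame = pd.read_csv(path)
    # Keep only the response and explanatory signals listed in the description,
    # tolerating columns that may be named differently or absent in practice.
    wanted = [c for c in RESPONSE_SIGNALS + EXPLANATORY_SIGNALS if c in frame.columns]
    return frame[wanted]

Iterating load_drive over the 68 subjects and four conditions would yield per-drive tables suitable, for example, for benchmarking the physiological channels against the driving-response signals.
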
format Online
Article
Text
id pubmed-5827115
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher Nature Publishing Group
record_format MEDLINE/PubMed
spelling pubmed-5827115 2018-03-19 A multimodal dataset for various forms of distracted driving Taamneh, Salah Tsiamyrtzis, Panagiotis Dcosta, Malcolm Buddharaju, Pradeep Khatri, Ashik Manser, Michael Ferris, Thomas Wunderlich, Robert Pavlidis, Ioannis Sci Data Data Descriptor We describe a multimodal dataset acquired in a controlled experiment on a driving simulator. The set includes data for n=68 volunteers that drove the same highway under four different conditions: No distraction, cognitive distraction, emotional distraction, and sensorimotor distraction. The experiment closed with a special driving session, where all subjects experienced a startle stimulus in the form of unintended acceleration—half of them under a mixed distraction, and the other half in the absence of a distraction. During the experimental drives key response variables and several explanatory variables were continuously recorded. The response variables included speed, acceleration, brake force, steering, and lane position signals, while the explanatory variables included perinasal electrodermal activity (EDA), palm EDA, heart rate, breathing rate, and facial expression signals; biographical and psychometric covariates as well as eye tracking data were also obtained. This dataset enables research into driving behaviors under neatly abstracted distracting stressors, which account for many car crashes. The set can also be used in physiological channel benchmarking and multispectral face recognition. Nature Publishing Group 2017-08-15 /pmc/articles/PMC5827115/ /pubmed/28809848 http://dx.doi.org/10.1038/sdata.2017.110 Text en Copyright © 2017, The Author(s) http://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ The Creative Commons Public Domain Dedication waiver http://creativecommons.org/publicdomain/zero/1.0/ applies to the metadata files made available in this article.
spellingShingle Data Descriptor
Taamneh, Salah
Tsiamyrtzis, Panagiotis
Dcosta, Malcolm
Buddharaju, Pradeep
Khatri, Ashik
Manser, Michael
Ferris, Thomas
Wunderlich, Robert
Pavlidis, Ioannis
A multimodal dataset for various forms of distracted driving
title A multimodal dataset for various forms of distracted driving
title_full A multimodal dataset for various forms of distracted driving
title_fullStr A multimodal dataset for various forms of distracted driving
title_full_unstemmed A multimodal dataset for various forms of distracted driving
title_short A multimodal dataset for various forms of distracted driving
title_sort multimodal dataset for various forms of distracted driving
topic Data Descriptor
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5827115/
https://www.ncbi.nlm.nih.gov/pubmed/28809848
http://dx.doi.org/10.1038/sdata.2017.110
work_keys_str_mv AT taamnehsalah amultimodaldatasetforvariousformsofdistracteddriving
AT tsiamyrtzispanagiotis amultimodaldatasetforvariousformsofdistracteddriving
AT dcostamalcolm amultimodaldatasetforvariousformsofdistracteddriving
AT buddharajupradeep amultimodaldatasetforvariousformsofdistracteddriving
AT khatriashik amultimodaldatasetforvariousformsofdistracteddriving
AT mansermichael amultimodaldatasetforvariousformsofdistracteddriving
AT ferristhomas amultimodaldatasetforvariousformsofdistracteddriving
AT wunderlichrobert amultimodaldatasetforvariousformsofdistracteddriving
AT pavlidisioannis amultimodaldatasetforvariousformsofdistracteddriving
AT taamnehsalah multimodaldatasetforvariousformsofdistracteddriving
AT tsiamyrtzispanagiotis multimodaldatasetforvariousformsofdistracteddriving
AT dcostamalcolm multimodaldatasetforvariousformsofdistracteddriving
AT buddharajupradeep multimodaldatasetforvariousformsofdistracteddriving
AT khatriashik multimodaldatasetforvariousformsofdistracteddriving
AT mansermichael multimodaldatasetforvariousformsofdistracteddriving
AT ferristhomas multimodaldatasetforvariousformsofdistracteddriving
AT wunderlichrobert multimodaldatasetforvariousformsofdistracteddriving
AT pavlidisioannis multimodaldatasetforvariousformsofdistracteddriving