Modeling Emotional Valence Integration From Voice and Touch
In the context of designing multimodal social interactions for Human–Computer Interaction and for Computer-Mediated Communication, we conducted an experimental study to investigate how participants combine voice expressions with tactile stimulation to evaluate emotional valence (EV). In this study, audio and tactile stimuli were presented separately, and then presented together. Audio stimuli comprised positive and negative voice expressions, and tactile stimuli consisted of different levels of air jet tactile stimulation performed on the arm of the participants. Participants were asked to evaluate communicated EV on a continuous scale. Information Integration Theory was used to model the multimodal valence perception process. Analyses showed that participants generally integrated both sources of information to evaluate EV. The main integration rule was the averaging rule. The predominance of one modality over the other was specific to each individual.
Main Authors: | Tsalamlal, Yacine; Amorim, Michel-Ange; Martin, Jean-Claude; Ammi, Mehdi |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2018 |
Subjects: | Psychology |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6194168/ https://www.ncbi.nlm.nih.gov/pubmed/30369901 http://dx.doi.org/10.3389/fpsyg.2018.01966 |
_version_ | 1783364183340875776 |
author | Tsalamlal, Yacine Amorim, Michel-Ange Martin, Jean-Claude Ammi, Mehdi |
author_facet | Tsalamlal, Yacine Amorim, Michel-Ange Martin, Jean-Claude Ammi, Mehdi |
author_sort | Tsalamlal, Yacine |
collection | PubMed |
description | In the context of designing multimodal social interactions for Human–Computer Interaction and for Computer-Mediated Communication, we conducted an experimental study to investigate how participants combine voice expressions with tactile stimulation to evaluate emotional valence (EV). In this study, audio and tactile stimuli were presented separately, and then presented together. Audio stimuli comprised positive and negative voice expressions, and tactile stimuli consisted of different levels of air jet tactile stimulation performed on the arm of the participants. Participants were asked to evaluate communicated EV on a continuous scale. Information Integration Theory was used to model the multimodal valence perception process. Analyses showed that participants generally integrated both sources of information to evaluate EV. The main integration rule was the averaging rule. The predominance of one modality over the other was specific to each individual. |
format | Online Article Text |
id | pubmed-6194168 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2018 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-6194168 2018-10-26 Modeling Emotional Valence Integration From Voice and Touch Tsalamlal, Yacine Amorim, Michel-Ange Martin, Jean-Claude Ammi, Mehdi Front Psychol Psychology In the context of designing multimodal social interactions for Human–Computer Interaction and for Computer-Mediated Communication, we conducted an experimental study to investigate how participants combine voice expressions with tactile stimulation to evaluate emotional valence (EV). In this study, audio and tactile stimuli were presented separately, and then presented together. Audio stimuli comprised positive and negative voice expressions, and tactile stimuli consisted of different levels of air jet tactile stimulation performed on the arm of the participants. Participants were asked to evaluate communicated EV on a continuous scale. Information Integration Theory was used to model the multimodal valence perception process. Analyses showed that participants generally integrated both sources of information to evaluate EV. The main integration rule was the averaging rule. The predominance of one modality over the other was specific to each individual. Frontiers Media S.A. 2018-10-12 /pmc/articles/PMC6194168/ /pubmed/30369901 http://dx.doi.org/10.3389/fpsyg.2018.01966 Text en Copyright © 2018 Tsalamlal, Amorim, Martin and Ammi. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Psychology Tsalamlal, Yacine Amorim, Michel-Ange Martin, Jean-Claude Ammi, Mehdi Modeling Emotional Valence Integration From Voice and Touch |
title | Modeling Emotional Valence Integration From Voice and Touch |
title_full | Modeling Emotional Valence Integration From Voice and Touch |
title_fullStr | Modeling Emotional Valence Integration From Voice and Touch |
title_full_unstemmed | Modeling Emotional Valence Integration From Voice and Touch |
title_short | Modeling Emotional Valence Integration From Voice and Touch |
title_sort | modeling emotional valence integration from voice and touch |
topic | Psychology |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6194168/ https://www.ncbi.nlm.nih.gov/pubmed/30369901 http://dx.doi.org/10.3389/fpsyg.2018.01966 |
work_keys_str_mv | AT tsalamlalyacine modelingemotionalvalenceintegrationfromvoiceandtouch AT amorimmichelange modelingemotionalvalenceintegrationfromvoiceandtouch AT martinjeanclaude modelingemotionalvalenceintegrationfromvoiceandtouch AT ammimehdi modelingemotionalvalenceintegrationfromvoiceandtouch |