
Using a new video rating tool to crowd-source analysis of behavioural reaction to stimuli


Bibliographic Details
Main Authors: Root-Gutteridge, Holly, Brown, Louise P., Forman, Jemma, Korzeniowska, Anna T., Simner, Julia, Reby, David
Format: Online Article Text
Language: English
Published: Springer Berlin Heidelberg 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8360862/
https://www.ncbi.nlm.nih.gov/pubmed/33751273
http://dx.doi.org/10.1007/s10071-021-01490-8
author Root-Gutteridge, Holly
Brown, Louise P.
Forman, Jemma
Korzeniowska, Anna T.
Simner, Julia
Reby, David
collection PubMed
description Quantifying the intensity of animals’ reaction to stimuli is notoriously difficult, as classic unidimensional measures of responses such as latency or duration of looking can fail to capture the overall strength of behavioural responses. More holistic ratings can be useful but have the inherent risks of subjective bias and lack of repeatability. Here, we explored whether crowdsourcing could be used to overcome these potential flaws efficiently and reliably. A total of 396 participants watched online videos of dogs reacting to auditory stimuli and provided 23,248 ratings of the strength of the dogs’ responses, from zero (default) to 100, using an online survey form. We found that raters achieved very high inter-rater reliability across multiple datasets (although their responses were affected by their sex, age, and attitude towards animals) and that as few as 10 raters could be used to achieve a reliable result. A linear mixed model applied to PCA components of behaviours showed that the dogs’ facial expressions and head orientation influenced the strength of behaviour ratings the most. Further linear mixed models showed that the strength of behaviour ratings was moderately correlated with the duration of dogs’ reactions but not with dogs’ reaction latency (from the stimulus onset). This suggests that observers’ ratings captured consistent dimensions of animals’ responses that are not fully represented by more classic unidimensional metrics. Finally, we report that, overall, participants strongly enjoyed the experience. Thus, we suggest that crowdsourcing can offer a useful, repeatable tool to assess behavioural intensity in experimental or observational studies where unidimensional coding may miss nuance, or where coding multiple dimensions may be too time-consuming. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s10071-021-01490-8. (An illustrative analysis sketch follows at the end of this record.)
format Online
Article
Text
id pubmed-8360862
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Springer Berlin Heidelberg
record_format MEDLINE/PubMed
spelling pubmed-8360862 2021-08-30 Anim Cogn (Methods Paper). Springer Berlin Heidelberg, published online 2021-03-09, 2021. © The Author(s) 2021. Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.
title Using a new video rating tool to crowd-source analysis of behavioural reaction to stimuli
topic Methods Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8360862/
https://www.ncbi.nlm.nih.gov/pubmed/33751273
http://dx.doi.org/10.1007/s10071-021-01490-8
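
Illustrative analysis sketch: the description above reports very high inter-rater reliability and that panels of as few as 10 raters gave reliable results. A minimal way to run that kind of check on a long-format ratings table is sketched below in Python. The column names (video, rater, score), the use of the pingouin package for the intraclass correlation, the simulated data, and the rater-subsampling loop are all assumptions made for illustration, not the authors' actual pipeline.

# Hypothetical sketch of a crowdsourced-rating reliability check.
# Column names (video, rater, score) and the use of pingouin are illustrative
# assumptions, not the published study's analysis code.
import numpy as np
import pandas as pd
import pingouin as pg

def panel_icc(ratings: pd.DataFrame) -> float:
    """Average-measures, two-way random-effects ICC (ICC2k) for a long-format table."""
    res = pg.intraclass_corr(data=ratings, targets="video",
                             raters="rater", ratings="score")
    return float(res.loc[res["Type"] == "ICC2k", "ICC"].iloc[0])

def subsampled_icc(ratings: pd.DataFrame, n_raters: int,
                   n_draws: int = 20, seed: int = 0) -> float:
    """Mean ICC over random panels of n_raters raters, to probe how small a panel can be."""
    rng = np.random.default_rng(seed)
    all_raters = ratings["rater"].unique()
    iccs = []
    for _ in range(n_draws):
        panel = rng.choice(all_raters, size=n_raters, replace=False)
        iccs.append(panel_icc(ratings[ratings["rater"].isin(panel)]))
    return float(np.mean(iccs))

if __name__ == "__main__":
    # Simulated stand-in for the real data: 30 videos rated 0-100 by 50 raters.
    rng = np.random.default_rng(1)
    true_strength = rng.uniform(0, 100, size=30)  # per-video "true" intensity
    rows = [{"video": v, "rater": r,
             "score": float(np.clip(true_strength[v] + rng.normal(0, 15), 0, 100))}
            for v in range(30) for r in range(50)]
    ratings = pd.DataFrame(rows)
    print("ICC, full panel:", round(panel_icc(ratings), 3))
    print("ICC, panels of 10 raters:", round(subsampled_icc(ratings, n_raters=10), 3))

The PCA-plus-mixed-model step mentioned in the description could be approximated in the same spirit, for example with scikit-learn's PCA on the coded behaviours followed by statsmodels' mixedlm with dog identity as a random effect, but that part is not shown here.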