Using a new video rating tool to crowd-source analysis of behavioural reaction to stimuli

Root-Gutteridge, Holly, Brown, Louise P, Forman, Jemma, Korzeniowska, Anna T, Simner, Julia and Reby, David (2021) Using a new video rating tool to crowd-source analysis of behavioural reaction to stimuli. Animal Cognition. ISSN 1435-9448

Accepted Version and Published Version available as PDF; the Published Version is available under a Creative Commons Attribution license.

Abstract

Quantifying the intensity of animals’ reactions to stimuli is notoriously difficult, as classic unidimensional measures of response such as latency or duration of looking can fail to capture the overall strength of behavioural responses. More holistic ratings can be useful but carry the inherent risks of subjective bias and lack of repeatability. Here, we explored whether crowdsourcing could be used to efficiently and reliably overcome these potential flaws. A total of 396 participants watched online videos of dogs reacting to auditory stimuli and provided 23,248 ratings of the strength of the dogs’ responses from zero (default) to 100 using an online survey form. We found that raters achieved very high inter-rater reliability across multiple datasets (although their responses were affected by their sex, age, and attitude towards animals) and that as few as 10 raters could be used to achieve a reliable result. A linear mixed model applied to PCA components of behaviours showed that the dogs’ facial expressions and head orientation influenced the strength-of-behaviour ratings the most. Further linear mixed models showed that strength-of-behaviour ratings were moderately correlated with the duration of dogs’ reactions but not with dogs’ reaction latency (from the stimulus onset). This suggests that observers’ ratings captured consistent dimensions of animals’ responses that are not fully represented by more classic unidimensional metrics. Finally, we report that participants strongly enjoyed the experience overall. Thus, we suggest that crowdsourcing can offer a useful, repeatable tool to assess behavioural intensity in experimental or observational studies where unidimensional coding may miss nuance, or where coding multiple dimensions may be too time-consuming.
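The paper does not publish its reliability code, so as a minimal illustrative sketch (not the authors' method), the kind of inter-rater consistency check described above can be approximated with Cronbach's alpha over a videos-by-raters matrix. All names, sizes, and noise levels below are hypothetical assumptions, chosen only to mimic a 0–100 rating task with 10 raters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 40 video clips, each rated 0-100 by 10 raters.
n_videos, n_raters = 40, 10
true_strength = rng.uniform(0, 100, size=n_videos)   # latent response strength
rater_bias = rng.normal(0, 5, size=n_raters)         # per-rater offset
noise = rng.normal(0, 10, size=(n_videos, n_raters)) # per-rating noise
ratings = np.clip(true_strength[:, None] + rater_bias[None, :] + noise, 0, 100)

def cronbach_alpha(x):
    """Consistency of columns (raters) across rows (videos).

    alpha = k/(k-1) * (1 - sum of per-rater variances / variance of row sums)
    """
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

print(f"alpha with {n_raters} raters: {cronbach_alpha(ratings):.3f}")
```

With rating noise much smaller than the between-video spread, even 10 simulated raters yield a high alpha, consistent with the abstract's claim that small rater pools can be reliable; the paper's actual analysis (and its PCA/linear-mixed-model steps) is more involved than this sketch.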

Item Type: Article
Schools and Departments: School of Psychology > Psychology
SWORD Depositor: Mx Elements Account
Depositing User: Mx Elements Account
Date Deposited: 25 Feb 2021 13:35
Last Modified: 11 Mar 2021 16:30
URI: http://sro.sussex.ac.uk/id/eprint/97385