Nonlinguistic vocalizations from online amateur videos for emotion research: A validated corpus

Research output: Contribution to journal › Article

Abstract

This study introduces a corpus of 260 naturalistic human nonlinguistic vocalizations representing nine emotions: amusement, anger, disgust, effort, fear, joy, pain, pleasure, and sadness. The recognition accuracy in a rating task varied greatly per emotion, from <40% for joy and pain, to >70% for amusement, pleasure, fear, and sadness. In contrast, the raters’ linguistic–cultural group had no effect on recognition accuracy: The predominantly English-language corpus was classified with similar accuracies by participants from Brazil, Russia, Sweden, and the UK/USA. Supervised random forest models classified the sounds as accurately as the human raters. The best acoustic predictors of emotion were pitch, harmonicity, and the spacing and regularity of syllables. This corpus of ecologically valid emotional vocalizations can be filtered to include only sounds with high recognition rates, in order to study reactions to emotional stimuli of known perceptual types (reception side), or can be used in its entirety to study the association between affective states and vocal expressions (production side).
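The classification approach described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes scikit-learn, and the feature matrix here is a synthetic placeholder standing in for the acoustic predictors named in the abstract (pitch, harmonicity, syllable spacing and regularity).

```python
# Sketch of supervised random-forest classification of emotion labels
# from acoustic features, per the study design. Data are synthetic
# placeholders; real features would come from acoustic analysis.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 260  # corpus size reported in the abstract

# Hypothetical feature columns: pitch, harmonicity,
# syllable spacing, syllable regularity.
X = rng.normal(size=(n, 4))

# The nine emotion categories from the abstract.
emotions = ["amusement", "anger", "disgust", "effort", "fear",
            "joy", "pain", "pleasure", "sadness"]
y = rng.choice(emotions, size=n)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"Mean accuracy: {scores.mean():.2f}")
```

With real acoustic features, per-class accuracy could then be compared against the human rating results (e.g., the >70% recognition for amusement, pleasure, fear, and sadness).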

Details

Research areas and keywords

Subject classification (UKÄ)

  • Psychology (excluding Applied Psychology)

Keywords

  • Emotion, Nonlinguistic vocalizations, Naturalistic vocalizations, Acoustic analysis
Original language: English
Pages (from-to): 758-771
Journal: Behavior Research Methods
Volume: 49
Issue number: 2
Early online date: 2016
Publication status: Published - 2017 Apr 29
Publication category: Research
Peer-reviewed: Yes