Pupil dilation reflects the dynamic integration of audiovisual emotional speech

Pablo Arias Sarah, Lars Hall, Ana Saitovitch, Jean Julien Aucouturier, Monica Zilbovicius, Petter Johansson

Research output: Contribution to journal › Article › peer-review

Abstract

Emotional speech perception is a multisensory process. When speaking with someone, we concurrently integrate information from their voice and face to decode, for example, their feelings, moods, and emotions. However, the physiological reactions associated with these processes—such as the reflexive dilation of the pupil—remain mostly unknown. The aim of the current article is to investigate whether pupillary reactions can index the processes underlying the audiovisual integration of emotional signals. To investigate this question, we used an algorithm able to increase or decrease the smiles seen in a person’s face or heard in their voice, while preserving the temporal synchrony between the visual and auditory channels. Using this algorithm, we created congruent and incongruent audiovisual smiles and investigated participants’ gaze and pupillary reactions to the manipulated stimuli. We found that pupil reactions can reflect emotional information mismatch in audiovisual speech. In our data, when participants were explicitly asked to extract emotional information from the stimuli, the first fixation within emotionally mismatching areas (i.e., the mouth) triggered pupil dilation. These results reveal that pupil dilation can reflect the dynamic integration of audiovisual emotional speech and provide insights into how these reactions are triggered during stimulus perception.

Original language: English
Article number: 5507
Journal: Scientific Reports
Volume: 13
Issue number: 1
DOIs
Publication status: Published - 2023 Dec

Subject classification (UKÄ)

  • Comparative Language Studies and Linguistics
