Since listeners usually look at the speaker's face, gestural information must be absorbed through peripheral visual perception. The literature suggests that listeners look directly at gestures under certain circumstances: 1) when the gesture is articulated in peripheral space; 2) when the speech channel is insufficient for comprehension; and 3) when the speaker him- or herself indicates that the gesture is worthy of attention. The research reported here employs eye-tracking techniques to study the perception of gestures in face-to-face interaction. The improved control over the listener's visual channel allows us to test the validity of the above claims. We present preliminary findings substantiating claims 1 and 3, and relate them to theoretical proposals in the literature and to the issue of how visual and cognitive attention are related.
Bibliographical note
The information about affiliations in this record was updated in December 2015. The record was previously connected to the following departments: Linguistics and Phonetics (015010003), Humanities Lab (015101200).
Subject classification (UKÄ)
- General Language Studies and Linguistics

Keywords
- visual perception
- eye movements