Acoustic features of multimodal prominences: Do visual beat gestures affect verbal pitch accent realization?

Research output: Chapter in Book/Report/Conference proceeding › Paper in conference proceeding

Abstract

The interplay of verbal and visual prominence cues has attracted recent attention, but previous findings are inconclusive as to whether and how the two modalities are integrated in the production and perception of prominence. In particular, we do not know whether the phonetic realization of pitch accents is influenced by co-speech beat gestures, and previous findings seem to generate different predictions. In this study, we investigate acoustic properties of prominent words as a function of visual beat gestures in a corpus of read news from Swedish television. The corpus was annotated for head and eyebrow beats as well as sentence-level pitch accents. Four types of prominence cues occurred particularly frequently in the corpus: (1) pitch accent only, (2) pitch accent plus head, (3) pitch accent plus head plus eyebrows, and (4) head only. The results show that (4) differs from (1)-(3) in terms of a smaller pitch excursion and shorter syllable duration. They also reveal significantly larger pitch excursions in (2) than in (1), suggesting that the realization of a pitch accent is to some extent influenced by the presence of visual prominence cues. Results are discussed in terms of the interaction between beat gestures and prosody, with a potential functional difference between head and eyebrow beats.

Details

Authors
  • Gilbert Ambrazaitis
  • David House
External organisations
  • KTH Royal Institute of Technology
Research areas and keywords

Subject classification (UKÄ)

  • General Language Studies and Linguistics

Original language: English
Title of host publication: Proceedings of The 14th International Conference on Auditory-Visual Speech Processing (AVSP2017)
Editors: Slim Ouni, Chris Davis, Alexandra Jesse, Jonas Beskow
Publisher: KTH
Number of pages: 6
Publication status: Published - 2017
Publication category: Research
Peer-reviewed: Yes
Event: International Conference on Auditory-Visual Speech Processing - KTH Department of Speech Music and Hearing, Stockholm, Sweden
Duration: 2017 Aug 25 - 2017 Aug 26
Conference number: 14
Internet address: http://avsp2017.loria.fr/

Publication series

ISSN (Electronic): 2308-975X

Conference

Conference: International Conference on Auditory-Visual Speech Processing
Abbreviated title: AVSP 2017
Country: Sweden
City: Stockholm
Period: 2017/08/25 - 2017/08/26


Related research output

Ambrazaitis, G. & House, D., 2016, p. 319. 1 p.

Research output: Contribution to conference › Poster

Ambrazaitis, G., Svensson Lundmark, M. & House, D., 2015, Proceedings from Fonetik 2015: Lund, June 8-10, 2015. Svensson Lundmark, M., Ambrazaitis, G. & van de Weijer, J. (eds.). Centre for Languages and Literature, Lund University, p. 11-16. 6 p. (Working Papers in General Linguistics and Phonetics; vol. 55).

Research output: Chapter in Book/Report/Conference proceeding › Paper in conference proceeding

