Investigating visual prosody using articulography

Johan Frid, Malin Svensson Lundmark, Gilbert Ambrazaitis, Susanne Schötz, David House

Research output: Chapter in Book/Report/Conference proceeding › Paper in conference proceeding › peer-review


Abstract

In this paper we describe our present work on multimodal prosody, based on
simultaneous recordings of articulation and head movements. Earlier work has
explored the patterning, usage and machine-learning-based detection of focal
pitch accents, head beats and eyebrow beats in audiovisual recordings.
Kinematic data obtained through articulography allows for more comparable
and accurate measurements, as well as three-dimensional data. Our current
approach therefore involves examining speech and body movements concurrently,
using electromagnetic articulography (EMA). We have previously recorded large
amounts of this kind of data, albeit for other purposes. In this paper, we
present results from a study on the interplay between head movements and
phrasing, and find tendencies for upward movements to occur before, and
downward movements to occur after, prosodic boundaries.
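
The abstract describes relating the direction of vertical head movement, measured with EMA, to prosodic boundary locations. As a rough illustration only, and not the authors' actual analysis, the Python sketch below computes mean vertical velocity of a head sensor in short windows before and after annotated boundary times; the sampling rate, window length, and all function and variable names are assumptions made for the example.

# Minimal sketch (not the authors' code): quantifying vertical head movement
# around prosodic boundaries from EMA-style kinematic data.
# Assumptions: head position is a 1-D array of vertical coordinates sampled
# at a constant rate, and boundary times come from a separate annotation;
# the 250 Hz rate and 500 ms window are illustrative, not from the paper.

import numpy as np

SAMPLE_RATE_HZ = 250          # assumed EMA sampling rate
WINDOW_S = 0.5                # analysis window on each side of a boundary

def mean_vertical_velocity(z, start_s, end_s, fs=SAMPLE_RATE_HZ):
    """Mean vertical velocity (units/s) of the head sensor in [start_s, end_s)."""
    i0, i1 = int(start_s * fs), int(end_s * fs)
    segment = z[max(i0, 0):min(i1, len(z))]
    if len(segment) < 2:
        return np.nan
    return np.mean(np.diff(segment)) * fs

def movement_around_boundaries(z, boundary_times_s, window_s=WINDOW_S):
    """Return (pre-boundary, post-boundary) mean velocity for each boundary.

    A positive pre-boundary value paired with a negative post-boundary value
    would be consistent with the upward-then-downward tendency reported above.
    """
    results = []
    for t in boundary_times_s:
        pre = mean_vertical_velocity(z, t - window_s, t)
        post = mean_vertical_velocity(z, t, t + window_s)
        results.append((pre, post))
    return results

if __name__ == "__main__":
    # Synthetic example: the head rises before a boundary at 1.0 s, falls after.
    fs = SAMPLE_RATE_HZ
    t = np.arange(0, 2.0, 1 / fs)
    z = np.where(t < 1.0, t, 2.0 - t)            # up then down (arbitrary units)
    print(movement_around_boundaries(z, [1.0]))  # roughly [(1.0, -1.0)]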
Original language: English
Title of host publication: Proceedings of the 4th Conference of The Association Digital Humanities in the Nordic Countries
Subtitle of host publication: Copenhagen, March 6-8 2019
Publisher: CEUR-WS
Publication status: Published - 2019
Event: DHN 2019 Digital Humanities in the Nordic Countries 4th Conference - University of Copenhagen, Copenhagen, Denmark
Duration: 2019 Mar 6 - 2019 Mar 8
Conference number: 4
https://cst.dk/DHN2019/DHN2019.html

Conference

Conference: DHN 2019 Digital Humanities in the Nordic Countries 4th Conference
Country/Territory: Denmark
City: Copenhagen
Period: 2019/03/06 - 2019/03/08
Internet address: https://cst.dk/DHN2019/DHN2019.html

Subject classification (UKÄ)

  • Language Technology (Computational Linguistics)
  • General Language Studies and Linguistics

