Semantic Self-monitoring in Speech

Project: Research

Description

During my PhD work, I developed what my collaborators and I call Real-time Speech Exchange (RSE): a new technique for feedback manipulation during speech that allows us to investigate the conceptualization process and the use of auditory feedback. Participants wear specially designed sound-isolating headsets, and RSE allows us to surreptitiously record words or phrases that they utter and then play the recording back to them later in the session, at the exact moment they utter another word. Because we simultaneously block out the feedback of what they are actually saying, we create situations where participants say one thing but receive real-time, timing-matched auditory feedback suggesting they are saying something else. In a previous experiment (Lind, Hall, Breidegard, Balkenius, & Johansson, 2014a), we used RSE to show that speakers often do not detect such manipulations, and that they often believe themselves to have said the inserted word rather than the word they actually said. This finding, that speakers listen to themselves in order to know what they are saying, is important and has consequences for how speech production and self-monitoring should be modelled. As it is the first of its kind, it raises many new questions, and the methods and the RSE technique we have developed have opened up a completely new approach to answering them. The present project is designed to investigate the phenomenon further from an experimental as well as a theoretical perspective.
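The core of the manipulation described above is a frame-by-frame substitution in the feedback channel: the speaker hears their own voice until a trigger point, after which a pre-recorded clip replaces the live signal. The sketch below illustrates that routing logic only; it is a hypothetical simplification, not the actual RSE implementation, and all names (`route_feedback`, `trigger_index`, the frame lists) are illustrative. A real system would run inside a low-latency audio callback.

```python
def route_feedback(live_frames, inserted_frames, trigger_index):
    """Illustrative sketch of RSE-style feedback substitution.

    Before the trigger, the speaker hears their own (live) voice;
    from the trigger onward, pre-recorded frames replace the live
    signal, so the speaker hears the inserted word instead of the
    word they are actually saying.
    """
    feedback = []
    insert_pos = 0
    for i, frame in enumerate(live_frames):
        if i >= trigger_index and insert_pos < len(inserted_frames):
            # Play back the surreptitiously recorded clip,
            # blocking the live signal for its duration.
            feedback.append(inserted_frames[insert_pos])
            insert_pos += 1
        else:
            # Normal auditory feedback: pass the live frame through.
            feedback.append(frame)
    return feedback

# Toy example: frames 0-1 are heard live, frames 2-3 are replaced
# by the recorded clip, frame 4 is live again.
print(route_feedback([1, 2, 3, 4, 5], [9, 8], trigger_index=2))
# → [1, 2, 9, 8, 5]
```

The timing-matched aspect of the technique corresponds to choosing `trigger_index` so that the playback onset coincides with the onset of the word the participant is actually uttering.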

Thus, the present project has three parallel goals: (i) to use RSE to investigate the effects of social and visual context on speech production and self-monitoring, and the neurocognitive and behavioural effects of auditory vs. internal self-monitoring; (ii) to advance a theoretical model of speech production and self-monitoring that can incorporate our findings; and (iii) to further develop RSE as a research tool by integrating it with other experimental methods, specifically eye-tracking and EEG.
Status: Completed
Effective start/end date: 2016/03/24 – 2019/03/23

Collaborative partners

Participants