Speech articulation is a highly complex process involving the co-ordination of multiple vocal tract effectors on rapid timescales. While much work has addressed the acoustic consequences of speech and their neural bases, comparatively little work has charted the neural representation of speech articulations themselves. Moreover, understanding of the physical vocal tract dynamics that are essential to speech production remains limited, with a paucity of data on articulatory learning during the acquisition of non-native speech sounds. In this talk, we present a novel framework for investigating the plasticity of speech articulation and its representation in the brain, using a combination of articulatory training, real-time vocal tract MRI, and functional MRI. We show that novel articulatory behaviour can be learned, as indexed by acoustic metrics and vocal tract dynamics. Moreover, we demonstrate for the first time that multivariate analysis techniques allow us to integrate vocal tract imaging and fMRI to probe the representational basis of vocal tract behaviour during sensorimotor transformation and articulation of speech.