Neuroscience Seminar Series: Jeremy Skipper (UCL): "Echoes of the spoken past: How the brain 'hears' context during spoken language comprehension"
Tuesday, 10 November 2015, 1pm to 2pm
What do we hear when someone speaks? What does auditory cortex (AC) do with that information? I present neuroimaging data suggesting that the impressions that we simply hear sounds and that AC sits at the bottom of a feedforward processing hierarchy are the wrong answers to these questions. Rather, when the brain is engaged by naturalistic language stimuli, it appears to dramatically self-organize to use available contextual information. In our experiments, context includes preceding sounds and discourse content, observable emotional facial displays and co-speech gestures, and memories of prior experiences observing speech-associated mouth movements and reading. This contextual information seems to be the starting point for forming hypotheses that are used to make predictions about the ambiguous information that might arrive in AC. Strong predictions result in a large conservation of metabolic resources in AC, presumably because no further evidence from the auditory world is required to confirm those hypotheses. Thus, the results suggest that a great deal of what we hear is not sound but, rather, an echo of internal knowledge that shapes and constrains interpretation of the impoverished information reaching AC. That is, hearing speech and AC functioning are constructive processes that rely on the contextual information available during real-world communication.
Hosts: Kate Watkins & Daniel Lametti