The field seems to be somewhat divided on this question, as the following quotes indicate:
"...many patients with Broca’s aphasia actually do show severe phonemic perception deficits..." (Wilson & Iacoboni 2006)
"Broca’s aphasia is associated with speech perception deficits..." -Iacoboni, in Gallese et al. 2011
"The findings reviewed in the present paper indicate that areas in the frontal lobe involved in speech production are specifically contributing to speech perception, and that activity in the motor system can alter speech discrimination, directly indicating that sensory-motor processes interact during speech comprehension. The experimental evidence of a causal relationship between activity in motor areas and speech perception ([D'Ausilio et al., 2009] and [Meister et al., 2007]) provide evidence that activation of motor areas during listening to speech is NOT the consequence of a corollary cortico-cortical connection ... but reflects the sensory-motor nature of perceptual representations." (D'Ausilio, et al. 2010)
“…lesion studies show that aphasics with damage to frontal motor-related structures largely retain the ability to perceive speech sounds.” (Hickok 2010)
A critical piece of evidence in this debate is the speech perception ability of individuals with damage to Broca’s area and/or Broca’s aphasia. Proponents of the view that frontal, motor-related regions are critically involved in speech perception often cite the literature of the 1970s and 1980s, which reported varying degrees of impairment on some speech perception tasks. But that older literature typically did not use neuroradiological data to confirm the location of the lesions, nor did those studies use signal detection methods.
I recently had the opportunity to team up with one of the major players in that earlier literature, Gabriele Miceli and his group, to re-examine this issue in a new sample of aphasics. From Miceli’s database of patients, we identified all of the cases that had substantial damage to Broca’s area, confirmed radiologically; the lesions typically involved surrounding regions as well. Twenty-four cases were identified. Nineteen were classified as Broca’s aphasics, five were on the border between Broca's and conduction aphasia, and one was classified as a conduction aphasic.
We assessed the ability of these patients to perceive speech sounds using a variety of tasks including same-different syllable discrimination and auditory word-to-picture matching with phonological and semantic distracter pictures. Of course, if Broca’s area and surrounding motor regions are critical for speech sound perception, we would expect to find substantial deficits on our tasks.
In fact, performance was remarkably good. On the syllable discrimination task the group averaged 94% correct with a d’ = 4.18, i.e., 4 standard deviations above chance performance. On the auditory comprehension task, performance was even better at 97% correct. Task still matters!
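For readers unfamiliar with d-prime: it is the difference between the z-transformed hit and false-alarm rates, so chance performance gives d' = 0 and higher values mean better discrimination. A minimal sketch of the computation (the hit and false-alarm rates below are made up for illustration, not taken from the study):

```python
# d' (d-prime) from signal detection theory, computed with the
# inverse normal CDF from the standard library.
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """d' = z(hit rate) - z(false-alarm rate); 0 = chance discrimination."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

print(d_prime(0.5, 0.5))    # chance performance -> 0.0
print(d_prime(0.97, 0.02))  # high sensitivity -> about 3.93
```

This is why d' is preferable to raw percent correct: it separates true sensitivity from response bias, which is exactly the methodological point raised above about the older literature.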
Speech output fluency varied across the sample. There was one fluent aphasic in the sample (the one not classified as a Broca’s aphasic). The rest were all non-fluent to varying degrees, consistent with the diagnosis of Broca’s aphasia. We removed the fluent patient, then grouped the rest by severity of non-fluency -- mild, moderate, severe -- and examined receptive speech abilities as a function of output fluency. If there is a relation between motor speech ability and speech perception, more severely non-fluent patients should perform more poorly on receptive speech tasks. There was no difference between groups (p = 0.39).
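The logic of the group comparison can be sketched with a simple permutation test (the actual test used in the paper isn't specified here, and the per-patient accuracies below are invented for illustration only): shuffle patients across severity groups and ask how often a difference as large as the observed one arises by chance.

```python
# Permutation test for a difference among group means.
# Statistic: range of the group means (max - min).
import random
from statistics import mean

def perm_test_groups(groups, n_perm=2000, seed=0):
    rng = random.Random(seed)
    sizes = [len(g) for g in groups]
    pooled = [x for g in groups for x in g]

    def stat(gs):
        means = [mean(g) for g in gs]
        return max(means) - min(means)

    observed = stat(groups)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        resampled, i = [], 0
        for s in sizes:
            resampled.append(pooled[i:i + s])
            i += s
        if stat(resampled) >= observed:
            count += 1
    return count / n_perm  # proportion of shuffles at least as extreme

# Hypothetical per-patient accuracies by non-fluency severity
mild     = [0.96, 0.93, 0.95, 0.97]
moderate = [0.94, 0.96, 0.92, 0.95]
severe   = [0.93, 0.95, 0.94, 0.96]
p = perm_test_groups([mild, moderate, severe])
```

A large p, as in the data above, means the observed spread between severity groups is unremarkable relative to chance reshufflings, i.e., no evidence that output fluency predicts receptive performance.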
Thus, the stroke/aphasia literature is consistent with evidence from Wada studies, developmental anarthria studies, normal development studies, and animal studies of speech perception: the motor system is not necessary for speech perception.
D'Ausilio, A., Craighero, L., & Fadiga, L. (in press). The contribution of the frontal lobe to the perception of speech. Journal of Neurolinguistics. doi:10.1016/j.jneuroling.2010.02.003
Hickok, G. (2010). The role of mirror neurons in speech perception and action word semantics. Language and Cognitive Processes, 25, 749-776.
Hickok, G., Costanzo, M., Capasso, R., & Miceli, G. (2011). The role of Broca’s area in speech perception: Evidence from aphasia revisited. Brain and Language. doi:10.1016/j.bandl.2011.08.001
Wilson, S. M., & Iacoboni, M. (2006). Neural responses to non-native phonemes varying in producibility: Evidence for the sensorimotor nature of speech perception. NeuroImage, 33(1), 316-325.
Interesting paper, Greg.
Jo Arciuli (U Syd) and I recently completed an fMRI study on probabilistic orthographic cues to grammatical category, finding a 3-way interaction for grammatical category x beginning x ending cues for both errors and activity in the left frontal operculum/insula during a lexical decision task. Poster to be presented at the Neurobiology of Language meeting in November. So, I think your suggestion that the speech-motor system might play a role in orthographic decoding is worth following up.
Sounds interesting. Right, I didn't mention that aspect of the study in the blog entry. While we did not see an effect of (non)fluency on auditory discrimination following Broca's area+ lesions, we did see a significant effect of (non)fluency on a visual-auditory version of the discrimination task. Subjects heard a syllable and saw an orthographic form that either matched or did not, and were asked to indicate whether the two matched.
More severely non-fluent patients performed more poorly on this task than mildly or moderately non-fluent patients, suggesting that the frontal lesions affect orthographic processes. This may be one reason why Broca's area seems to light up more strongly in any language task that involves written stimuli. It also highlights, once again, the importance of task selection. If you want to assess speech perception, your results will vary depending on whether you choose auditory-visual matching, auditory-auditory discrimination, or word recognition.
Looking forward to seeing you and your poster in Annapolis!
Indeed the paper “adds to a growing body of evidence” that the motor system is not necessary for syllable discrimination and word comprehension of “stimuli delivered at comfortable listening level … in a quiet room”. But unless such a formulation is accepted as a definition of basic listening conditions, the argument over whether the motor system is necessary or not will go on.
In Figs 3 & 4 the moderate group is systematically better than the mild one. Why’s that? In which group(s) are the 5 border cases between Broca’s and conduction aphasia?
I suspect you are right. The motor people have retreated to the view that the motor system is important under noisy listening conditions. Let's grant that possibility and see where it takes us theoretically. When the acoustic signal is ambiguous and we nonetheless ask the subject to make a decision, it is obvious that s/he will use whatever information is available to help inform that decision. In real-world speech situations, context is probably the dominant factor, where context includes lexical, semantic, syntactic, pragmatic, visual speech ... am I missing any? But because we are clever experimentalists, we remove all those constraining factors. What's left to help people decide? A motor representation maybe? If hearing a speech sound automatically activates a corresponding motor program, as seems to be the case, then if the auditory trace is weak, subjects may consult the motor side to see if that helps. If we mess with the motor representation via TMS or fatigue, we may be able to modulate (bias) the motor representation and therefore modulate (bias) the decision. Not necessarily the percept in the auditory system, but the decision based on a correlated, transformed version of the percept. Viewed in this way, I throw the question back: is the motor system necessary for perceiving speech under noisy conditions? Or is it just involved under contrived lab conditions?
In Figs 3 and 4 the difference between mild and moderate was not statistically reliable. There's nothing there, so no explanation is warranted.
The 5 border cases were by definition all mildly nonfluent. The conduction aphasic was fluent and so not included in those graphs at all.
Pulvermüller and colleagues showed that TMS applied to the lip or tongue motor areas affects phoneme discrimination, so the motor system may have a function under some conditions. Although a quiet room is somewhat less unnatural than TMS, any theory should take both results into account. But as to whether the motor system is necessary under noisy conditions, I don't dare to guess.
para 2 & 3: OK.
1. Are the TMS induced effects specific to the task? If they assessed word recognition would they see the same effect?
2. Relatedly, people like to say that listening to speech in noise is more natural than clear speech in a quiet environment. True, many situations are noisy. But you'll never be at a cocktail party and make a lexical decision, decide whether two meaningless syllables are the same or different, or push one of three buttons depending on whether your interlocutor said ba, pa, or ta. You want to promote ecological validity? Worry less about noise and more about the TASK!
1. Single words are not typical speech either. Yet it makes sense to start with phonemes (or rather syllables), then words, simple sentences, etc.
2. Haha, never!
If I may switch over to your MN Forum discussion, I'm just reading Gallese & Lakoff, Cogn. Neuropsychol. 2005, 22, 455. On p. 458 they say:
“We will use the results on monkeys as applying to humans for the simple reason that there is enough evidence to support the notion of an analogy—when not a homology—between the monkey and human brain regions we will be discussing”.
In the MN Forum, Gallese took a more cautious view. Too bad that he's not willing to discuss it here.