Monday, November 26, 2012

Cortex special issue: Language and the motor system

Observation #1.  In the editorial Cappa and Pulvermuller write,
Whereas the dominant view in classical aphasiology had been that superior temporal cortex (“Wernicke’s area”) provides the unique engine for speech perception and comprehension (Benson, 1979), investigations with functional neuroimaging in normal subjects have shown that even during the most automatic speech perception processes inferior fronto-central areas are being sparked (Zatorre et al., 1992).
I take it that they are referring to Zatorre's task, in which subjects listened to pairs of CVC syllables (some of which were words, some of which were not) while alternating button presses between two keys.  Contrasted with noise, activation foci for this automatic-speech-perception-of-random-CVC-syllables-while-alternating-button-pressing were reported in the superior temporal gyrus bilaterally, the left middle temporal gyrus, and the left IFG.  Clearly the stronger activations in the temporal lobe (nearly double the z-scores) are doing little in the way of speech perception, and it's the IFG activation that refutes the classical view.

I wonder why no mention was made of a rather nifty study published around the same time by Mazoyer et al. (1993), in which a larger sample of subjects listened to sentences of various sorts, and which did not yield consistent activation in the IFG. This finding has persisted into more recent research: listening to normal sentences does not produce robust IFG activation.  Sometimes you see it, sometimes you don't (see Rogalsky & Hickok, 2011, for a review). Superior temporal cortex, the area people were writing about on their IBM Selectrics (Google it, youngster), is not so fickle.  Present speech and it lights up like a sparkler on Independence Day.

Hopes of a balanced (and therefore useful) volume already sinking.  And I haven't even made it past the first paragraph of the editorial.

Mazoyer, B. M., Tzourio, N., Frak, V., Syrota, A., Murayama, N., Levrier, O., Salamon, G., Dehaene, S., Cohen, L., & Mehler, J. (1993). The cortical representation of speech. Journal of Cognitive Neuroscience, 5, 467-479.

Rogalsky, C., & Hickok, G. (2011). The role of Broca's area in sentence comprehension. Journal of Cognitive Neuroscience, 23, 1664-1680.


Jonathan Peelle said...

I look forward to your thoughts on language and the motor system. I suspect that we largely agree on the degree to which the motor system is necessary for language comprehension. However, I do take issue with your assertion that listening to sentences does not result in consistent IFG activation. I am not aware of any study that (a) is reasonably powered (say, 16+ subjects), (b) contrasts sentences with an acoustic control such as signal-correlated noise (or performs parametric correlations with intelligibility), and (c) is not contaminated by background noise (for fMRI, uses sparse scanning), yet does NOT find left inferior frontal cortex activation for sentences. Of course, this assumes that the above criteria are sensible, but I think it's hard to argue they are not. If there are exceptions, I would be grateful to have them pointed out.

That being said, I think we are on the same page in thinking that many of the explanations for this activity don't necessarily hold up, which I took to be the main point of Rogalsky and Hickok (2011). I fully agree, and will further be quick to acknowledge that I don't have a proven explanation for the cognitive processes going on in LIFG during sentence comprehension, which of course is important (!). (I tend to think of it as probably closest to the Hagoort view of semantic integration, but wouldn't defend this to the death just yet.)

And I also fully agree that many types of sentence comprehension can be performed by individuals with damage to left inferior frontal cortex. I don't think that LIFG is necessary for all types of sentence processing, but at the same time I find that it is clearly active when healthy adults listen to connected speech.

I won't be so boring as to list the studies that support this, but a selection can be found in my recent (2012) Frontiers opinion piece. Patti Adank's meta-analysis of speech processing also supports left inferior frontal activity for sentences > words, which I think is consistent with this view. I would hope that in service of the more balanced (and therefore useful) view ;-), some of these citations may find their way into future papers in which you address the issue. As always, I am happy to talk further offline if that would be useful!

Adank P (2012) Design choices in imaging speech comprehension: An activation likelihood estimation (ALE) meta-analysis. NeuroImage 63:1601-1613.

Peelle JE (2012) The hemispheric lateralization of speech processing depends on what "speech" is: A hierarchical perspective. Front Hum Neurosci 6:309.

Greg Hickok said...

Hi Jonathan,
You've convinced me that background scanner noise does indeed change the activation pattern somewhat, so that is a fair critique of what I claimed about the IFG and sentences. You do agree, though, that there are plenty of good studies using continuous acquisition that fail to show IFG activation, no? Rogalsky et al. (J Neurosci) is a good example: N = 20, and not much going on in Broca's area for sentences or music.

STG has no problem activating with or without scanner noise. So how would you explain the role of the IFG in sentence processing if it is so fickle that it doesn't reveal itself against background noise? Wouldn't you expect *auditory* areas to be more sensitive to this? Why doesn't Broca's area ramp up its activity when speech is presented in a noisy environment? After all, those TMS studies don't find any effects of motor stimulation unless speech is presented in noise.

I just find the whole obsession with IFG and the motor system in language processing a bit puzzling given the weight of the data against the idea.

Jonathan Peelle said...

Hi Greg,

Yes, I think we mostly agree. IFG is clearly not a sensory area the way superior temporal cortex is, and the response there is not as strong during sentence comprehension. I also agree that in studies using continuous scanning, LIFG activity for connected speech is found less consistently. The issue of acoustic challenge and LIFG activity is something I hope to have more to share on soon.

Although I mostly agree with your point about the Rogalsky et al. (2011) study, I would also point out that (I think) the sentences were Jabberwocky. There are probably interesting differences between Jabberwocky and coherent speech; in fact, Matt has also shown that these differences in activity depend on SNR (which relates to the background scanner noise issue) (Davis et al., 2011).

Anyway, I think many of the issues you bring up are sensible, but also resolvable. And I don't think any of them necessitate relying on a motor system account of the kind that Cappa and Pulvermuller would argue for.

Davis MH, Ford MA, Kherif F, Johnsrude IS (2011) Does semantic context benefit speech understanding through "top-down" processes? Evidence from time-resolved sparse fMRI. J Cogn Neurosci 23:3914-3932.