Ok, first of all, I shouldn't be working on a weekend. That's David's job, but here I am anyway...
By leaving the issue out of my previous post, I was hoping someone would raise a question about whether motor-related information CAN influence speech perception, and how that might be explained on an "acoustic" theory of speech perception. Without even encouraging him, one of my own students brought it up. (I swear it wasn't a plant.) Kenny Vaden made the following point in a comment on my last post:
"While aphasias undermine the MT [Motor Theory] claim that phonological representations are *completely* motoric/gestural, the survival of speech perception without production does not mean that motoric information is not available to speech perception at all."
He's absolutely right. Knowledge of how speech is produced does appear to influence our perception of speech, at least under some circumstances. The paper we read discusses several lines of evidence in support of this position, and the data are reasonably compelling. The most obvious demonstration of this is probably the McGurk effect, but there are others.
So we must acknowledge that motor knowledge can influence perception. But does this mean that an acoustic model is not correct? Do we have to admit that at least part of speech perception, or speech perception under some circumstances, involves perceiving gestures? No, we don't. Here's a simple explanation: knowledge of how speech is produced can have a top-down influence on the acoustic perception of speech information. Top-down expectations of a variety of sorts can influence all kinds of perceptual events, including speech. For example, the lexical status of a CVC syllable affects the perception of its constituent phonemes (e.g., category boundaries shift toward the lexical item in a b-p continuum with bag and pag as endpoints). So why can't motor expectations (e.g., forward modeling, predictive coding, analysis by synthesis -- whatever you want to call it) have a top-down influence on acoustic representations? They can.
Conclusion: motor effects on perception do not falsify an acoustic theory of speech perception. But they do suggest that motor knowledge can influence perception in a top-down fashion, just like many types of knowledge can.
Top-down is a convenient formula, but do we indeed have any evidence that would allow us to distinguish vertical information transfer from horizontal transfer?
Why can we exclude the possibility that both acoustics and articulation contribute to speech perception in a parallel bottom-up fashion, but that the contribution of the articulatory pathway is in general much weaker?
Such an approach -- a weak contribution of the motor pathway -- might gain some support from a study by Meister and colleagues, "The essential role of premotor cortex in speech perception," Curr. Biol. 17(19), 1692-96. After stimulating the premotor cortex with rTMS, the authors observed a slight decrease in the CV syllable discrimination rate.
Thanks for the comment Daniel. That's exactly the question we need to ask next: is there any evidence for top-down vs. a more horizontal architecture? I haven't looked into it in any detail yet. I think David P. might have some data on this regarding the timecourse of visual and auditory speech integration -- maybe we can coax a comment out of him on this.
In the Galantucci et al. paper, they discussed some evidence that seemed, to my reading, to support a top-down model of the influence of motor knowledge. I don't have the paper in front of me, but it was in connection with the study where people viewed changes in arm position. Knowledge of how arms move influenced the perceived trajectory of the movement, but only at longer ISIs. At shorter ISIs, the perception violated biologically possible limb trajectories. This suggested to me that the motor knowledge arrives later in the perceptual process, which may be explained by a top-down architecture.
Regarding the Meister et al. study -- as we've pointed out many times before, CV discrimination is dependent on frontal lobe integrity, unlike speech recognition/comprehension. So in effect, using a discrimination task to assess frontal motor involvement in speech perception is cheating if you want to generalize the claim to speech recognition (i.e., ecologically valid speech processing). If the claim is just that frontal motor cortex is important for CV discrimination (which doesn't predict speech recognition behavior), then I'm totally good with that, but of course, we knew that already from aphasia research published in the 1970s and 1980s.
Good points--the possibility of a 'perceptual theory of speech production' means that we can now only count as evidence for motor theory cases where pure perception is affected by motor properties. My friend Henny Yeung was talking to me a few years ago about an experiment where blowing a puff of air against a person's throat would affect their perception of aspiration. Don't know if that was a real or thought experiment, but seems like that would be more the type of thing that you'd want to look for.
I see the top-down point, but it places a strong constraint on the evidence that we would consider definitive -- since now we can just explain away any motor effects as 'late', and since you can see top-down effects within a few hundred ms, it seems like the only evidence that could speak to the question would have to use some kind of electrophysiological measure...
Thanks for your reply Greg. Word perception and syllable discrimination do doubly dissociate in some cases; in other cases, I guess, they do not. We are not entitled to conclude in general that nowhere in the brain does impaired syllable perception predict impaired word recognition.
Stimulation in the Meister and colleagues' study was more posterior and higher than Broca's area, for which most of the arguments on syllable/word dissociation were developed. I'm not aware of an aphasia case arising from destruction of upper PMC (not inferior frontal areas) in which word perception was spared while syllable perception was affected. Thus we cannot straight away exclude a possible effect of PMC TMS stimulation on word perception as well.
This is not to say that an experiment with real words is not needed to complement the Meister et al. study, but rather that we have no grounds to suggest that the observed phenomena have nothing to do with speech perception.
Another good point Daniel. I would actually predict that word recognition and syllable discrimination are correlated in patients with damage to the superior temporal lobe. The impairments, however, would probably not be identical in magnitude due to differences in lateralization: disc = more left involvement, recog = more bilateral.
Well, transcortical motor aphasia has a loose association with lesions either anterior or posterior to Broca's region, but the real point is that most Broca's aphasics have very large lesions that involve more dorsal PMC areas, so I'd be willing to bet that the two tasks dissociate following damage to that region.