Wednesday, August 15, 2007

Meta-linguistic tasks -- Part 2

Our observation is that data from meta-linguistic tasks (e.g., syllable discrimination or identification) impedes progress in understanding the functional anatomy of speech processing. How so?

Take lesion data as one example. If you look at the evidence, you find that deficits on syllable discrimination tasks are commonly observed following left hemisphere damage, with the most severe deficits associated with frontal and/or parietal lesions. The straightforward conclusion from such a result is that speech perception is supported predominantly by left frontal and/or parietal regions. The problem with this conclusion is that patients with damage to frontal and/or parietal regions in the left hemisphere typically have quite good auditory comprehension. More to the point, as Blumstein* has pointed out, "Significantly, there does not seem to be a relationship between speech perception abilities [performance on discrimination tasks] and auditory language comprehension. Patients with good auditory comprehension skills have shown impairments in speech processing; conversely, patients with severe auditory language comprehension deficits have shown minimal speech perception deficits." (p. 924)

This is a bit of a paradox: why don't deficits on syllable discrimination tasks predict auditory comprehension problems? There are two possibilities. One is that auditory comprehension tasks contain contextual cues that allow the listener to get by even with an imperfect phonemic processor. The other is that syllable discrimination tasks are invalid measures of normal speech sound processing. We've argued that the latter is true. More on that in the next entry...

*Blumstein, S.E. (1995). The neurobiology of the sound structure of language. In M.S. Gazzaniga (Ed.), The Cognitive Neurosciences. MIT Press.
