Thursday, August 23, 2007

Meta-linguistic tasks -- Part 3

What leads us to the conclusion that meta-linguistic tasks, such as syllable discrimination, are not valid measures of normal speech sound processing? The data tell the story:

Speech sound processing in comprehension and in syllable discrimination doubly dissociate, even when contextual cues are controlled in the comprehension task. We reviewed the evidence most thoroughly in our 2004 Cognition paper. Several reports examining phoneme identification and/or discrimination are described there (Basso et al., 1977; Blumstein et al., 1977; Caplan et al., 1995), but the Miceli et al., 1980 paper (Brain and Language, 11:159-169) is worth highlighting again. They studied more than 60 aphasics using a CCVC syllable discrimination task and an auditory comprehension task using word-to-picture matching. Critically, the comprehension task employed both phonological and semantic foils. The inclusion of phonological foils (e.g., a picture of a pear when the stimulus word is bear) minimizes the possibility of using contextual cues in comprehension. Performance was categorized as normal or pathological based on comparison with age-matched controls. The table, reproduced from our Cognition article, summarizes the findings. Notice that 19 patients had pathological performance on the discrimination task yet were normal on the comprehension task, and 9 showed the reverse pattern. A clear double dissociation.

Anatomical correlations with syllable discrimination deficits are also revealing. The most severe deficits on syllable discrimination tasks are associated with frontal lobe lesions. For example, Gainotti et al., 1982 (Acta Neurol. Scandinav. 66: 652-665) report error rates as a function of lesion location. Patients with left hemisphere lesions restricted to the frontal or parietal lobes made significantly more errors than patients with lesions restricted to the temporal lobe, and the worst performance was found in frontal patients. This is an important observation because (i) it suggests that deficits on syllable discrimination tasks are not particularly related to auditory processes (auditory cortex damage appears neither necessary nor sufficient to produce the deficit), and (ii) since frontal or parietal damage typically spares lexical comprehension, such a finding provides further evidence for the lack of relation between auditory comprehension and syllable discrimination tasks.

Conclusion: syllable discrimination is not a valid measure of speech sound processing, at least in the context of aphasia. What we have suggested is that performance of syllable discrimination tasks requires frontal-lobe related cognitive processes, such as working memory, that are not as critical for normal auditory comprehension, and it is these processes that are being disrupted by frontal and/or parietal lesions, rather than the (bilateral!) temporal lobe-based mechanisms involved in speech sound processing that are critical to auditory comprehension.

Wednesday, August 15, 2007

Meta-linguistic tasks -- Part 2

Our observation is that data from meta-linguistic tasks (e.g., syllable discrimination or identification) impede progress in understanding the functional anatomy of speech processing. How so?

Take lesion data as one example. If you look at the evidence, you find that deficits on syllable discrimination tasks are commonly observed following left hemisphere damage, with the most severe deficits associated with frontal and/or parietal lesions. The straightforward conclusion from such a result is that speech perception is supported predominantly by left frontal and/or parietal regions. The problem with this conclusion is that patients with damage to frontal and/or parietal regions in the left hemisphere typically have quite good auditory comprehension. More to the point, as Blumstein* has pointed out, "Significantly, there does not seem to be a relationship between speech perception abilities [performance on discrimination tasks] and auditory language comprehension. Patients with good auditory comprehension skills have shown impairments in speech processing; conversely, patients with severe auditory language comprehension deficits have shown minimal speech perception deficits." (p. 924)

This is a bit of a paradox: why is it that deficits on syllable discrimination tasks don't predict auditory comprehension problems? There are two possibilities. One is that auditory comprehension tasks contain contextual cues that allow the listener to get by even with an imperfect phonemic processor. The other possibility is that syllable discrimination tasks are invalid measures of normal speech sound processing. We've argued that the latter is true. More on that next entry...

*Blumstein, S.E. (1995). The Neurobiology of the Sound Structure of Language. In M. S. Gazzaniga (Ed.), The Cognitive Neurosciences. MIT Press.

Thursday, August 9, 2007

Meta-linguistic tasks -- Part 1

This is the first entry in a thread on problems associated with the use of meta-linguistic tasks in studying the neuroscience of language. By meta-linguistic, I mean tasks that require explicit attention to some subcomponent of linguistic processing that normally isn't consciously accessed during natural language processing. So whereas the meaning of an utterance is regularly accessed under naturalistic conditions ('Wow, David's talk was full of false statements!'), the phonemic structure, for example, largely goes unnoticed by the listener ('Wow, David uttered the syllable /ba/ 16 times during his talk!'). We could probably argue about how to define meta-linguistic, but I hope the basic contrast is clear enough.

Task effects represent one of the major issues David and I have been harping on in our papers over the last several years, and one that is probably still fairly controversial. Here is one conclusion we have come to regarding the use of meta-linguistic tasks:

Data from meta-linguistic speech tasks generally impede progress in understanding the functional organization of speech processing.

This was certainly true in the past, and probably continues today. This is not to say that speech scientists should be banned from, say, asking their subjects to discriminate pairs of syllables. In fact, we have been known to employ meta-linguistic tasks in our own studies. Rather, the point is that data from such tasks (any task actually) should be interpreted very carefully in the context of the cognitive operations involved and their relation to those processes involved in more natural speech processing (i.e., for comprehension). In many cases, unfortunately, findings from meta-linguistic tasks have little relevance to understanding normal language processing, and if one assumes such findings are generalizable to normal situations, we end up barking up the wrong gyrus.

Based on some anonymous reviews of our papers, this claim remains either contentious or misunderstood. The goal of this thread is to clarify and underline our position on this issue, hopefully with some interesting discussion.

More to follow...

Wednesday, August 1, 2007

Speech and Language post-doc, Cambridge, UK

MEG post-doctoral position, Speech and Language, MRC Cognition and Brain Sciences Unit, Medical Research Council (MRC), Cambridge

Career Development Fellow
Ref: 2007-442
The MRC Cognition and Brain Sciences Unit (CBSU) is an internationally renowned research institute with state-of-the-art cognitive neuroscience facilities, including on-site fMRI, MEG, and EEG laboratories, a neuropsychological patient panel, and a genotyped panel of healthy volunteers. Applications are invited for a full-time Career Development Fellow.

This is a three-year post-doctoral position to conduct neuro-imaging research into the dynamic neural systems underlying human language comprehension, working in a lively, interdisciplinary, cross-linguistic research environment led by William Marslen-Wilson, with access to state-of-the-art MEG (306-channel Elekta/Neuromag VectorView) and MRI (Siemens Trio 3T) imaging facilities.

You will have doctoral training in neuro-imaging (preferably MEG or EEG) with a strong interest in studying the neuro-cognition of language, previous experience with experimental psycholinguistic research, and the ability to work collaboratively but independently within an active cross-linguistic research team.

The starting salary will be in the range of £24,993 - £30,945 per annum, depending upon qualifications and experience. This is supported by a flexible pay and reward policy and an optional MRC final salary Pension Scheme. We offer 30 days' annual leave entitlement. On-site parking is available.

For further information and an application pack, please contact the recruitment team by e-mail: or by telephone on 01793 301154, quoting reference 2007-442. Please include a CV and covering letter, stating the names and addresses of two professional referees, with your application form.
Closing date: 22nd August 2007

For further information about MRC visit

The Medical Research Council is an Equal Opportunities Employer.
‘Leading Science for Better Health’

Contact Information:
MRC Cognition and Brain Sciences Unit
15 Chaucer Road
Cambridge CB2 7EF