Tuesday, January 27, 2009

Where is 'where' in the human auditory cortex?

This is the title of an interesting and highly cited paper by Zatorre and colleagues (2002) which questions the existence of a dedicated spatial processing system (or stream) in human auditory cortex. I have highlighted this issue previously here on Talking Brains in the context of my (now) former graduate student Kevin Smith's fMRI study on the topic. Having spent the last year+ trying to get the paper published, I'd like to return to the issue, in part to vent a little frustration and in part to hopefully clarify a bit of confusion -- maybe some of my own.

First the story on the paper, which now has the proud distinction of being rejected from the Journal of Neuroscience, not once, but twice! Our basic result: we failed to find a region within the planum temporale that showed a selective response to auditory spatial manipulations (more details in the previous post; see above link). As with previous studies, we easily identified a region that showed a greater response to a stimulus (we used human speech) that spatially varied than to a stationary stimulus. However, we found that simply by adding additional sound objects (more talkers) at the same stationary location we generated just as much of an activation increase in the "spatial" ROI. That is, "spatial" regions are equally sensitive to non-spatial manipulations, specifically the number of auditory objects in the signal; they are not selective for spatial manipulations. This surprised me, and we thought the Journal of Neuroscience would be interested.

First submission: a reviewer criticized our use of passive listening, stating that spatial activations in the PT were only found in studies using active tasks, and that is why we found no selective spatial activation. Editors rejected the paper. I appealed the decision (there is a formal process) citing several different papers that showed spatial activations in the PT using passive stimulation. The editors invited a resubmission of the paper along with the opportunity (actually, requirement) to pay another $100 submission fee.

Second submission: round one went fairly well (they do the reviews from scratch after an appeal) with the usual comments and criticisms, and we were invited to revise and resubmit. We thought we addressed the concerns pretty well, and we satisfied one of the reviewers, but another (apparently new!) reviewer criticized the paper harshly. Quoting from a 2000 paper by Rauschecker and Tian (a macaque study), the reviewer stated that our "core prediction" of a "signal increase in PT as a function of sound sources" is itself predicted by proponents of a dorsal, spatial processing stream. Said reviewer went on to quote from Talking Brains (!), implying that we don't understand our own data! Wasn't that the review comment David got once? :-) The editors rejected the paper again.

So this is a bit irritating (please someone correct me if I'm missing something!) because (i) our core prediction was NOT that the PT should be modulated by the number of sound sources, but rather that the PT should NOT have a spatially selective region in it, and (ii) since Rauschecker and Tian's statements, there have been several claims that there is a spatially selective region within the human PT. Here are a few quotes and pictures:

Warren & Griffiths, 2003
The present study ... suggests anatomically distinct spatial (posteromedial) and object (anterolateral) processing mechanisms within PT... (p. 5803)

Barrett & Hall, 2006:
Manipulating the temporal parameters that inform changes in spatial location on one hand and pitch perception on the other produced regions of cortical activation that were distinct apart from a small region of overlap in anterior PT. Responses to pitch occurred within and anterior to HG, while responses to changes in spatial location were restricted to posterior non-primary auditory cortex in PT (p. 976)

(Pink=spatial, yellow=non-spatial, blue=overlap)
Altmann et al., 2007:
Both pattern and location changes implicated partly overlapping areas with the pattern changes additionally activating more anterior areas and the location changes implicating posterior temporal areas... This was confirmed by a contrast analysis ... that revealed significantly stronger fMRI responses ... for pattern compared to location changes within the bilateral anterior aspects of the STG and STS. ...we observed significantly stronger fMRI responses for the location compared to pattern changes within the bilateral inferior parietal lobule (IPL), in the right temporo-parietal junction (TPJ) and right anterior insula (p. 1196-1197)

(green=spatial, red=non-spatial, yellow=overlap)

I take this to mean that more than one highly competent and well-respected group believes that there is a region within the human PT that is selective for spatial processing. (There may also be a region that is sensitive to both spatial and non-spatial signals according to these authors, but still the claim seems to be that at least some part of the PT is indeed space dedicated.) Using a different kind of non-spatial manipulation than any of these studies, i.e., adding simultaneously presented auditory objects, we showed that the "spatial" activations are not purely spatial. Either I'm seriously misreading this literature or somebody owes me $100. (Dear reviewer: I know you read this blog, so either correct me, or pay up! ;-) )

In any case, I still think Zatorre is right. There is no "spatial" area. Instead, and consistent with Zatorre et al.'s suggestion, these spatial activations may reflect auditory stream segregation, using spatial cues, rather than spatial processing per se.


Altmann C, Bledowski C, Wibral M, Kaiser J (2007). Processing of location and pattern changes of natural sounds in the human auditory cortex. NeuroImage, 35(3), 1192-1200. DOI: 10.1016/j.neuroimage.2007.01.007

Barrett D, Hall D (2006). Response preferences for "what" and "where" in human non-primary auditory cortex. NeuroImage, 32(2), 968-977. DOI: 10.1016/j.neuroimage.2006.03.050

Warren JD, Griffiths TD (2003). Distinct mechanisms for processing spatial sequences and pitch sequences in the human auditory brain. Journal of Neuroscience, 23, 5799-5804.

Zatorre RJ, Bouffard M, Ahad P, Belin P (2002). Where is 'where' in the human auditory cortex? Nature Neuroscience, 5(9), 905-909. DOI: 10.1038/nn904

Wednesday, January 21, 2009

Imagery or meaning in category-specific brain activity?

A common finding in the functional neuroimaging of "semantic processing" is that motor areas activate, somatotopically even, during the processing of action-related words. For example, when subjects process (e.g., read) a word such as kick, motor areas corresponding to leg movements are activated, whereas for a word such as kiss, lip areas are activated. The problem with this type of result is that it cannot determine whether motor cortex is activating because it is part of the semantic representation of the words (the usual claim) or simply because there is an association between the word meanings and the actions they refer to. That is, the meaning of kick may be stored independently of motor cortex, but because this meaning is associatively linked to kicking actions, access to the meaning of kick results in spreading activation within brain networks to which it is associatively linked, including the leg region of motor cortex.

A recent study by Hauk et al. (2008) attempted to address this "imagery" vs. "meaning" ambiguity. They designed a decent study. There was a set of action words (e.g. ‘grasp’, ‘limp’, ‘bite’) and a set of non-action words that had imageable attributes (e.g. ‘snow’, ‘blond’, ‘cube’), and they looked for regions that showed word frequency effects that were specific to one category versus the other. The logic was that imagery processes should not show sensitivity to word frequency, as frequency effects are associated with lexical-level processes. We could quibble here -- e.g., they didn't control concept familiarity -- but let's not.

So what did they find? Did they find category-specific frequency effects? Yes! They were able to identify one region that only showed frequency effects for action words and a different region that only showed frequency effects for non-action words. Does this mean that motor cortex, which presumably is coding action semantics, really is coding action semantics? Well, no. It turns out that the frequency effects were in the temporal lobe, not the frontal lobe:

If you don't read this paper carefully, it is easy to miss this little fact. For example, the abstract states:

we corroborated previous results showing that action-relatedness modulates neural responses in action-related areas, while word imageability modulates activation in object processing areas.

This is not a false statement, it's just that the "action-related areas" they are referring to are in the middle temporal gyrus. They go on to write,

we suggest that category-specific brain activation reflects distributed neuronal ensembles, which ground language and concepts in perception-action systems of the human brain

Again, based on their findings, these "perception-action systems" don't seem to involve motor cortex.

The findings of this study are interesting, but they seem to support the view that even action concepts rely more on posterior brain networks than on frontal, motor-related networks.

Hauk O, Davis MH, Kherif F, Pulvermüller F (2008). Imagery or meaning? Evidence for a semantic origin of category-specific brain activity in metabolic imaging. European Journal of Neuroscience, 27(7), 1856-1866. DOI: 10.1111/j.1460-9568.2008.06143.x

Tuesday, January 13, 2009

Seeking research assistant/lab manager for Poeppel Lab at NYU

I have moved to the psychology department at NYU, and I am looking for a person to do … well, everything. Basically, I need someone who runs me. I think that working in my lab is (or can be…) pretty fun and interesting, but I am a horrible boss, mainly because I am bad at delegating. (I don't think that I am a bad boss because I am an ogre or inflexible or weird. Just bad at delegating. Really.) My last two RAs, with whom I am certainly still good friends, both ended up in different careers – one worked for me for five years and then went to medical school in New York, the next one worked for me for three years and then went to acting school at the American Repertory Theater at Harvard. So it is not a total show-stopper to work in my lab …

Please forward the message/ad to anyone you know who might be interested in applying for the position.


David Poeppel's lab in the psychology department at NYU is currently hiring a lab manager/lab coordinator/research assistant who will help organize various aspects of research. The lab is focused on auditory cognition, speech perception, and language comprehension (Center for Language, Speech, and Hearing – ClaSH@NYU). The position -- at the main NYU campus at Washington Square in Greenwich Village in New York -- entails a wide variety of responsibilities, from the banal to the beautiful: running subjects (MEG, fMRI, EEG, psychophysics), helping with data analysis, participant recruitment, scheduling, handling all work related to IRB submissions/reviews, public relations, ordering and maintaining equipment, assistance with grants preparation and management, telling the PI and the post-docs and students what to do next, and so on.

Qualifications. Applicants must be computer literate and have (at least) a bachelor's degree in psychology, linguistics, neuroscience, biology, engineering, computer science, or some related discipline. Matlab competence would be an asset, as would be the willingness to learn and try new quantitative techniques. Most importantly, the job requires being excited about cognitive neuroscience research -- and particularly language, speech, and hearing research -- having excellent communication skills, having both tolerance and a robust sense of humor, and generally being street smart and quick and flexible.

Editorially inclined candidates have an additional opportunity: as of now, David Poeppel is editing the cognitive neuroscience of language section of the journal Language and Cognitive Processes. There is additional funding available to support the person helping with the editorial work, which involves all aspects of evaluating and publishing scientific manuscripts, including communicating with authors and reviewers.

The position is available immediately. Interested candidates should send (i) a letter outlining their background and interests, (ii) their resume/CV, and (iii) the names and contact information for three references to david.poeppel@nyu.edu. Questions about the position are welcome.

Thursday, January 8, 2009

Functional organization of the planum temporale

This is the title of a talk I'm giving at the Auditory Cognitive Neuroscience Society meeting tomorrow in Tucson. What I'm going to argue is that there is no such thing. Let me explain...

The planum temporale is a gross anatomical feature. Although it is often referred to and studied as a functional region -- e.g., The Planum Temporale as a Computational Hub (Griffiths and Warren, 2002), among many other papers -- there is no evidence to support this view. Cytoarchitectonic data indicate at least four distinct fields within the PT, and these fields do not respect the anatomical boundaries of the region; i.e., the cytoarchitectonic areas extend beyond the PT to include the lateral STG, parietal operculum, and supramarginal gyrus. Further, although the PT is often referred to as "auditory cortex," only the anterior portion of it appears to be auditory cortex proper.

In other words, the PT is not a functionally unitary region, and we should stop trying to characterize it as such.

Griffiths and Warren (2002) noticed correctly that a range of different types of stimuli can activate the PT. To account for this observation they proposed that the PT functions as a computational hub, which they envision as a kind of pattern matcher/router: various kinds of input come in and get sorted and routed to the appropriate processing systems. I've looked closely at two functions that have consistently implicated the PT region, spatial hearing and sensory-motor integration. It turns out that if you look at the activation maps associated with these two functions on a within-subject basis, they light up distinct areas of the PT region. Spatial hearing-related activations (listening to sounds coming from a variety of spatial locations > from a single location) occupy a more anterior location that is likely within the auditory cortex portion of the PT, while the sensory-motor activations (regions that activate both during the perception and covert production of speech) are more posterior, likely within the non-auditory regions of the PT. This kind of result provides further evidence for a functionally heterogeneous PT, and argues against hypotheses like Griffiths and Warren's computational hub.

Griffiths TD, Warren JD (2002). The planum temporale as a computational hub. Trends in Neurosciences, 25(7), 348-353. DOI: 10.1016/S0166-2236(02)02191-4