Thursday, May 27, 2010

Syntax found in the brain -- not in Broca's area

Syntactic processing and Broca's area have been cozy bedfellows ever since work in the 1970s showed that patients with Broca's aphasia have difficulty comprehending syntactically complex sentences. Although further lesion-based evidence severely weakened the relationship (e.g., Broca's aphasics are pretty good at grammaticality judgments), subsequent PET and fMRI studies prolonged the marriage by showing that comprehending complex sentences activates Broca's area more than comprehending simple sentences. These days the literature on Broca's area and sentence processing is full of controversy, with proposals ranging from domain-specific functions such as syntactic movement processing (Grodzinsky), hierarchical structure and phrase structure building (Friederici), and "linearization" (Bornkessel-Schlesewsky), to domain-general functions such as cognitive control (Novick), "unification" (Hagoort), and working memory (Caplan; Rogalsky/Hickok).

Meanwhile, evidence has been accumulating that the anterior temporal lobe may house a network that behaves much more like a syntactic computation system, in that its activity is highly correlated with the presence or absence of syntactic structure in a sentence (Dronkers et al., 2004; Friederici et al., 2000; Humphries et al., 2005, 2006; Mazoyer et al., 1993; Rogalsky & Hickok, 2008; Stowe et al., 1998; Vandenberghe, Nobre, & Price, 2002).

A new study in Brain and Language weighs in on the issue using the psycholinguists' favorite work of literature, Alice in Wonderland. Unlike most psycholinguistic nods to Lewis Carroll (a.k.a. Charles Lutwidge Dodgson), the study by Jonathan Brennan and colleagues did not use Jabberwocky sentences, but instead had subjects listen to text from Alice while chillin' in a giant magnetic donut. The authors calculated, word by word, the amount of syntactic structure involved in integrating each word (basically a syntactic tree node-counting analysis). These values were then correlated with the fMRI signal.
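The logic of a node-count regressor analysis can be sketched in a few lines. This is a toy illustration with made-up numbers, and it omits the hemodynamic modeling a real fMRI analysis requires; the node counts and "BOLD" values below are hypothetical, not the actual Alice stimulus values.

```python
from math import sqrt
from random import Random

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical node counts: how many phrase-structure nodes are built
# when each successive word is integrated into the parse (illustrative
# numbers only).
node_counts = [2, 1, 4, 1, 3, 1, 5, 2, 1, 3]

# A toy "BOLD" response that tracks structure building plus noise; in a
# region sensitive to syntax, the regressor and signal should correlate.
rng = Random(0)
bold = [0.5 * n + rng.gauss(0, 0.3) for n in node_counts]

print(round(pearson_r(node_counts, bold), 2))
```

A brain region whose signal yields a reliably nonzero correlation with the node-count regressor is a candidate locus of syntactic structure building; in Brennan et al.'s data that region was the anterior temporal lobe.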

So what brain region correlated with syntactic structure? You guessed it: the anterior temporal lobe.

References

Brennan, J., Nir, Y., Hasson, U., Malach, R., Heeger, D., & Pylkkänen, L. (2010). Syntactic structure building in the anterior temporal lobe during natural story listening. Brain and Language. DOI: 10.1016/j.bandl.2010.04.002

Bornkessel-Schlesewsky I, Schlesewsky M, von Cramon DY. 2009. Word order and Broca's region: evidence for a supra-syntactic perspective. Brain Lang. 111:125-139.

Caplan D, Alpert N, Waters G, Olivieri A. 2000. Activation of Broca's area by syntactic processing under conditions of concurrent articulation. Hum Brain Mapp. 9:65-71.

Friederici AD, Meyer M, von Cramon DY. 2000. Auditory language comprehension: An event-related fMRI study on the processing of syntactic and lexical information. Brain and Language. 74:289-300.

Grodzinsky, Y. (2001). The neurology of syntax: Language use without Broca’s area. Behavioral and Brain Sciences, 23(01), 1–21.

Hagoort P. 2005. On Broca, brain, and binding: a new framework. Trends Cogn Sci. 9:416-423.

Humphries C, Binder JR, Medler DA, Liebenthal E. 2006. Syntactic and semantic modulation of neural activity during auditory sentence comprehension. J Cogn Neurosci. 18:665-679.

Humphries C, Love T, Swinney D, Hickok G. 2005. Response of anterior temporal cortex to syntactic and prosodic manipulations during sentence processing. Human Brain Mapping. 26:128-138.

Mazoyer BM, Tzourio N, Frak V, Syrota A, Murayama N, Levrier O, Salamon G, Dehaene S, Cohen L, Mehler J. 1993. The cortical representation of speech. Journal of Cognitive Neuroscience. 5:467-479.

Novick JM, Trueswell JC, Thompson-Schill SL. 2005. Cognitive control and parsing: reexamining the role of Broca's area in sentence comprehension. Cogn Affect Behav Neurosci. 5:263-281.

Rogalsky C, Matchin W, Hickok G. 2008. Broca's Area, Sentence Comprehension, and Working Memory: An fMRI Study. Front Hum Neurosci. 2:14.

Rogalsky C, Hickok G. 2009. Selective Attention to Semantic and Syntactic Features Modulates Sentence Processing Networks in Anterior Temporal Cortex. Cereb Cortex. 19:786-796.

Stowe LA, Broere CA, Paans AM, Wijers AA, Mulder G, Vaalburg W, Zwarts F. 1998. Localizing components of a complex task: sentence processing and working memory. Neuroreport. 9:2995-2999.

Vandenberghe R, Nobre AC, Price CJ. 2002. The response of left temporal cortex to sentences. J Cogn Neurosci. 14:550-560.

Tuesday, May 25, 2010

Neurobiology of Language Conference Submission deadline approaching

A reminder that abstract submission for the second Neurobiology of Language Conference (NLC 2010) will be closing in one week, on Tuesday, June 1st at midnight (CST)! To submit an abstract, visit our website at http://www.neurolang.org. Don’t miss this opportunity to share your research with the neurobiology of language scientific community!

NLC 2010 will be held on November 11-12 2010 in San Diego as a satellite of the Society for Neuroscience annual meeting. A reminder that you do not need to attend the Society for Neuroscience (SfN) annual meeting, or be a member of SfN, to attend NLC. If you are planning to attend SfN, however, please note that SfN regulations allow individuals to present their SfN abstracts during satellite events.


We look forward to seeing you in San Diego!

Sincerely,


Pascale Tremblay, Ph.D., Postdoctoral Scholar, The University of Chicago
Steven L. Small, Ph.D., M.D., Professor, The University of Chicago



The Neurobiology of Language Planning Group:
Michael Arbib, Ph.D., University of Southern California, USA
Jeffrey Binder, M.D., Medical College of Wisconsin, USA
Vincent Gracco, Ph.D., McGill University, Canada
Yosef Grodzinsky, Ph.D., McGill University, Canada
Murray Grossman, M.D., Ed.D., University of Pennsylvania, USA
Peter Hagoort, Ph.D., Max Planck Institute, Netherlands
Gregory Hickok, Ph.D., University of California, Irvine, USA
Marta Kutas, Ph.D., The University of California, San Diego, USA
Alec Marantz, Ph.D., New York University, USA
Howard Nusbaum, Ph.D., The University of Chicago, USA
Cathy Price, Ph.D., University College London, UK
David Poeppel, Ph.D., New York University, USA
Rita Salmelin, Ph.D., Helsinki University of Technology, Finland
Kuniyoshi Sakai, Ph.D., The University of Tokyo, Japan
Steven L. Small, Ph.D., M.D., The University of Chicago, USA
Sharon Thompson-Schill, Ph.D., University of Pennsylvania, USA
Richard Wise, M.D., Ph.D., Imperial College London, UK
Kate Watkins, Ph.D., University of Oxford, UK

Three years of Talking Brains

After three years of Talking Brains we've made a couple of small updates to the blog.

One change is the addition of "ANNOUNCEMENTS" and "JOB POSTINGS" subsections, which you can find at the top of the page. Click on a link and you will see recent posts in that category. The number of people sending us conference/call-for-papers announcements and job postings has increased substantially in the last year or so. We are happy to post them for you. With 6-10k hits a month lately, you will get reasonable visibility.

We've also changed the domain name to www.talkingbrains.org (the old address will also still work).

Thanks for reading what we have to say and thanks especially to those who contribute via comments and guest posts. We certainly welcome any and all participation as the goal of the blog is to promote discussion of issues surrounding the neuroscience of language.

-greg

Friday, May 21, 2010

Dissociation of mirror system activity and action understanding: Evidence from sign language

Sign language is arguably an ideal system to study in the context of the mirror neuron theory of action understanding, particularly its relation to language: you've got overt manual gestures, not those pesky obscured gestures associated with speech, and you have the ability to study the relation between action understanding in linguistic (sign language) and nonlinguistic (pantomime) domains. This comparison is particularly interesting given recent claims that speech evolved from a manual gesture system (e.g., Rizzolatti & Arbib, 1998; Corballis, 2010). To date, evidence from the sign language literature has been less than supportive of a role for the mirror system in action understanding (Corina & Knapp, 2008; Knapp & Corina, 2010).

A recent study by Karen Emmorey and colleagues continues this non-supportive trend. Deaf signers and hearing non-signers were studied using fMRI during the perception of non-linguistic gestures (pantomimes) and linguistic gestures (American Sign Language verbs). Behaviorally, of course, both types of gestures are meaningful to signers, but only the pantomimes were meaningful to hearing non-signers.



The findings were unexpected from the perspective of the mirror neuron theory of action understanding, at least for the deaf group. The hearing subjects showed activation in the expected visual-related areas in the ventral occipito-temporal region as well as in the fronto-parietal "mirror system". This was true both for stimuli that were meaningful to them (pantomimes) and for stimuli that were not (ASL verbs). So "understanding" isn't what's driving the mirror system -- but we knew that already from previous work on viewing meaningless gestures. Surprisingly, the deaf signers did not activate the mirror system during the perception of pantomimes at all, and activated only a small focus in Broca's area during the perception of ASL verbs. Comprehension performance on pantomimes, assessed after the scan, was equivalent for the deaf and hearing groups.



It is unclear to me why the two groups should differ so dramatically, but it is clear that you don't need to activate the "mirror system" to understand actions. Emmorey et al. state it succinctly:

We conclude that the lack of activation within the MNS for deaf signers does not support an account of human communication that depends upon automatic sensorimotor resonance between perception and action.


References


Corballis, M. (2010). Mirror neurons and the evolution of language. Brain and Language, 112 (1), 25-35. DOI: 10.1016/j.bandl.2009.02.002

Corina, D. P., & Knapp, H. P. (2008). Signed language and human action processing: evidence for functional constraints on the human mirror-neuron system. Annals of the New York Academy of Sciences, 1145, 100-112. PMID: 19076392

Emmorey, K., Xu, J., Gannon, P., Goldin-Meadow, S., & Braun, A. (2010). CNS activation and regional connectivity during pantomime observation: no engagement of the mirror neuron system for deaf signers. NeuroImage, 49 (1), 994-1005. PMID: 19679192

Knapp, H. P., & Corina, D. P. (2010). A human mirror neuron system for language: Perspectives from signed languages of the deaf. Brain and Language, 112 (1), 36-43. PMID: 19576628

Rizzolatti, G., & Arbib, M. (1998). Language within our grasp. Trends in Neurosciences, 21 (5), 188-194. DOI: 10.1016/S0166-2236(98)01260-0

Thursday, May 6, 2010

Gesture discrimination deficits implicate temporal-parietal regions not the "mirror system"

A new study in J. Neuroscience failed to replicate a previous finding published in the same journal that linked gesture discrimination deficits to tissue loss in the inferior frontal gyrus, part of the supposed human mirror system. The new study, by Nelissen et al., examined correlations between gesture discrimination (and a range of other language and non-language tasks) and patterns of tissue degeneration in primary progressive aphasia (PPA). They found that gesture discrimination did not correlate with tissue loss in the inferior frontal gyrus, and instead correlated with tissue loss in posterior temporal-parietal regions including portions of the superior temporal gyrus, which is not part of the human "mirror system" (see Fig. 2A from Nelissen et al., below).


This is from the same group that published the previous stroke-based study that reported an association between tissue damage in the inferior frontal gyrus (part of the "mirror system") and gesture discrimination deficits (see my critique of that study here, or in Hickok, 2009). The same stimuli were used in both studies, so why the difference?

I had pointed out that the stroke study was potentially biased because it used percent correct, rather than d' (d-prime), which corrects for response bias, for its gesture discrimination-brain lesion correlation. The previous study used a yes/no response paradigm: subjects saw a gesture and had to decide whether it was correctly executed or not. Using percent correct is a problem because some subjects (perhaps as a function of their lesion!) may be biased toward "yes" or "no" responses, which can skew the results independently of how well they are actually discriminating the gestures. The present study used the same stimuli but a modified design, 3-alternative forced choice: subjects viewed three gestures and then had to decide which of the three was correctly executed. A still frame of each gesture was left on the screen to minimize working memory effects. This paradigm reduces bias, especially when the order and position of the correct item are counterbalanced, as was done by Nelissen et al. It appears, then, that reducing the bias shifted the brain region that showed a correlation with gesture discrimination performance.
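To see why percent correct and d' can come apart, here is a minimal sketch with hypothetical hit and false-alarm rates (not data from either study). d' is z(hit rate) minus z(false-alarm rate); a subject who says "yes" to almost everything can have the same underlying sensitivity as an unbiased subject yet a quite different percent correct:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def percent_correct(hit_rate: float, fa_rate: float) -> float:
    """Percent correct with equal numbers of signal and noise trials."""
    return (hit_rate + (1 - fa_rate)) / 2

# Hypothetical unbiased subject: hits 0.75, false alarms 0.25
# Hypothetical yes-biased subject: hits 0.95, false alarms 0.616
print(round(d_prime(0.75, 0.25), 2), round(percent_correct(0.75, 0.25), 2))
# -> 1.35 0.75
print(round(d_prime(0.95, 0.616), 2), round(percent_correct(0.95, 0.616), 2))
# -> 1.35 0.67
```

The two subjects discriminate equally well (d' = 1.35 in both cases), but the biased subject's percent correct is lower, which is exactly the kind of distortion a bias-corrected measure is meant to remove from a lesion-behavior correlation.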

There is much more to this study than what I've highlighted here, including their finding that measures of gesture and language processing are highly correlated. But that's a topic for another blog entry.

References

Hickok G (2009). Eight problems for the mirror neuron theory of action understanding in monkeys and humans. Journal of cognitive neuroscience, 21 (7), 1229-43 PMID: 19199415

Nelissen, N., Pazzaglia, M., Vandenbulcke, M., Sunaert, S., Fannes, K., Dupont, P., Aglioti, S., & Vandenberghe, R. (2010). Gesture Discrimination in Primary Progressive Aphasia: The Intersection between Gesture and Language Processing Pathways Journal of Neuroscience, 30 (18), 6334-6341 DOI: 10.1523/JNEUROSCI.0321-10.2010

Monday, May 3, 2010

NLC2010: Abstract Submission is Now Open!

It is our great pleasure to announce that abstract submission for the second Neurobiology of Language Conference (NLC 2010) is now open! For more information, please visit our new website at http://www.neurolang.org or send us an email. Don’t miss this opportunity to share your research with the neurobiology of language scientific community! The abstract submission deadline is Tuesday, June 1st at midnight (CST).

Also note that early registration is now open: to take advantage of early registration rates, visit our website now.

NLC 2010 will be held on November 11-12 2010 in San Diego as a satellite of the Society for Neuroscience annual meeting. A reminder that you do not need to attend the Society for Neuroscience (SfN) annual meeting, or be a member of SfN, to attend NLC. If you are planning to attend SfN, however, please note that SfN regulations allow individuals to present their SfN abstracts during satellite events.


We look forward to seeing you in San Diego!


Kind regards,


Pascale Tremblay, Ph.D., Postdoctoral Scholar, The University of Chicago
Steven L. Small, Ph.D., M.D., Professor, The University of Chicago


The Neurobiology of Language Planning Group:

Jeffrey Binder, M.D., Medical College of Wisconsin, USA
Vincent Gracco, Ph.D., McGill University, Canada
Yosef Grodzinsky, Ph.D., McGill University, Canada
Murray Grossman, M.D., Ed.D., University of Pennsylvania, USA
Peter Hagoort, Ph.D., Max Planck Institute, Netherlands
Gregory Hickok, Ph.D., University of California, Irvine, USA
Marta Kutas, Ph.D., The University of California, San Diego, USA
Alec Marantz, Ph.D., New York University, USA
Howard Nusbaum, Ph.D., The University of Chicago, USA
Cathy Price, Ph.D., University College London, UK
David Poeppel, Ph.D., New York University, USA
Rita Salmelin, Ph.D., Helsinki University of Technology, Finland
Kuniyoshi Sakai, Ph.D., The University of Tokyo, Japan
Steven L. Small, Ph.D., M.D., The University of Chicago, USA
Sharon Thompson-Schill, Ph.D., University of Pennsylvania, USA
Richard Wise, M.D., Ph.D., Imperial College London, UK
Kate Watkins, Ph.D., University of Oxford, UK

Saturday, May 1, 2010

Post-doc and PhD scholarships: New Zealand Institute of Language, Brain, and Behaviour

From Megan McAuliffe:

My colleagues and I have recently received funding to form the New Zealand Institute of Language, Brain, and Behaviour. See http://www.nzilbb.canterbury.ac.nz/ for details.

As part of this, we are offering five postdoctoral fellow positions and a number of PhD scholarships. See full details at http://www.nzilbb.canterbury.ac.nz/fellowships.shtml. Could you please forward this information to anyone you think might be interested in these positions. I am more than happy to field any questions directly.

Many thanks
Megan



________________________________________
Megan McAuliffe PhD
Theme Coordinator, Ageing & Language
New Zealand Institute of Language, Brain and Behaviour
www.nzilbb.canterbury.ac.nz

Department of Communication Disorders
University of Canterbury | Private Bag 4800
Christchurch | New Zealand 8140
+64 3 364 2987 ext. 7075
megan.mcauliffe@canterbury.ac.nz
http://www.cmds.canterbury.ac.nz/people/mcauliffe.shtml