Tuesday, November 27, 2012

Language and the Motor System - Editorial

And another quote from the editorial:

phonological features of speech sounds are reflected in motor cortex activation so that the action system likely plays a double role, both in programming articulations and in contributing to the analysis of speech sounds (Pulvermuller et al., 2006)
which explains why prelingual infants, individuals with massive strokes affecting the motor speech system, individuals undergoing Wada procedures with acute and complete deactivation of the motor speech system, individuals with cerebral palsy who never acquired the ability to control their motor speech system, and chinchillas and quail can all perceive speech quite impressively.

One of the most frequently cited brain models of language indeed still sees a role of the motor system limited to articulation, thus paralleling indeed the position held by classical aphasiologists, such as Wernicke, Lichtheim and especially Pierre Marie (Poeppel and Hickok, 2004). Recently, a contribution to speech comprehension and understanding is acknowledged insofar as inferior frontal cortex may act as a phonological short-term memory resource (Rogalsky and Hickok, 2011). These traditional positions are also discussed in the present volume, along with modern action-perception models.
Good to hear we will get the "traditional" perspective.  David, did you ever think WE would be called "traditional"?  Nice to see that our previously radical views are now the standard theory.

Let's try turning the tables:

One of the most frequently cited brain models of speech perception indeed still sees the motor system as playing a critical role, thus paralleling indeed the position held by classical speech scientists of the 1950s such as Liberman and even the early 20th century behaviorists such as Watson (Pulvermuller et al. 2006).

Moreover, one of the most frequently cited brain models of conceptual representation indeed still sees sensory and motor systems as being the primary substrate thus paralleling indeed the position held by classical aphasiologists, such as Wernicke and Lichtheim (Pulvermuller et al. 2006).


Anonymous said...

My congratulations to you, Greg, for joining the classics!

Yes, we know that IFG is not necessary for speech perception. Nevertheless, its involvement seems to be automatic in typical subjects. M. Coltheart once noted that automaticity operates not only when a process isn't required for the task in question but even when it's harmful to the task (e.g., in the Stroop task). So perhaps the question is not whether IFG is involved in speech perception, but under which circumstances it is useful, and whether, under some circumstances, it is even harmful?

BTW, a fascinating finding for me is that people are better at reading their own lips than others' (Tye-Murray et al., 2012, Reading your own lips: Common-coding theory and visual speech perception. Psychon Bull Rev. DOI 10.3758/s13423-012-0328-5). How is it that one is able to lipread one's own motor program better than other people's, even though one has hardly ever watched oneself speaking? (a kind of poverty of the stimulus)

Anonymous said...

In which respects do you consider your model to be "radical"? I would consider it a modern version of the Wernicke-Lichtheim-Geschwind model... you don't see it that way?

Greg Hickok said...

Well, our 2000 paper was initially rejected at Neuron for being too controversial. Primarily this was because we claimed that speech perception was bilateral.