An interesting new study by Tsunada, Lee, and Cohen (2011) has found that rhesus monkeys show categorical perception of a speech sound continuum (dad to bad) and, further, that the population response of neurons in the anterior lateral belt region of auditory cortex appears to reflect the categories. However, the average activity of the auditory cortex cells did not predict response choice. A previous study from the Cohen lab (Russ et al. 2008) found that neurons in ventral prefrontal cortex did correlate with the monkey's behavioral response in a similar speech discrimination task.
So what have we learned? First, we have yet more evidence that you don't need a motor speech system to perform well on a subtle speech perception task involving minimal pair place of articulation contrasts (/b/ vs. /d/). We can add monkeys to the list of critters that can do it. Second, we learned that auditory cortex seems to code the categories, at least in the population response. The decision in such tasks, however, is not read off of the auditory response directly, but is mediated by prefrontal regions. This fits well with human stroke and imaging data suggesting a similar division of labor: auditory-related areas code speech categories while frontal regions are critical for task-related decision making, at least for these sorts of tasks.
This set of papers is definitely worth a look...
Russ, B., Orr, L., & Cohen, Y. (2008). Prefrontal neurons predict choices during an auditory same-different task. Current Biology, 18(19), 1483-1488. DOI: 10.1016/j.cub.2008.08.054
Tsunada, J., Lee, J., & Cohen, Y. (2011). Representation of speech categories in the primate auditory cortex. Journal of Neurophysiology, 105(6), 2634-2646. DOI: 10.1152/jn.00037.2011
What does it mean for the neuroscience of language that various mammals and birds are able to categorize speech sounds? If, in place of the continuum from bad to dad, the authors had employed a continuum of ringing tones, would the monkeys categorize in the same way?