Wednesday, April 23, 2008

Monkey Lip Reading


First it was Broca's area in the chimp; now a new study examines audiovisual integration in the perception of monkey vocalizations.

The study by Ghazanfar, Chandrasekaran, & Logothetis (J. Neurosci. 2008, 28:4457-69) recorded single units as well as local field potentials in the STS and in auditory cortex of macaque monkeys. They report that responses in auditory cortex (lateral belt regions) are influenced by visual inputs from the STS. (Abstract below)

This looks like a pretty nice study that provides direct evidence for multisensory integration in belt areas of auditory cortex. The STS may not be the only source of input to these multisensory cells in the lateral belt region, however. In humans, lip reading activates a large network that includes frontal regions, and, as mentioned previously, feedback projections from motor-speech areas may also influence responses in auditory cortex (at least in humans).



Interactions between the Superior Temporal Sulcus and Auditory Cortex Mediate Dynamic Face/Voice Integration in Rhesus Monkeys

Asif A. Ghazanfar,1,2 Chandramouli Chandrasekaran,1 and Nikos K. Logothetis2

1Neuroscience Institute and Department of Psychology, Princeton University, Princeton, New Jersey 08540, and 2Max Planck Institute for Biological Cybernetics, 72076 Tuebingen, Germany

Correspondence should be addressed to Asif A. Ghazanfar, Neuroscience Institute and Department of Psychology, Green Hall, Princeton University, Princeton, NJ 08540. Email: asifg@princeton.edu

The existence of multiple nodes in the cortical network that integrate faces and voices suggests that they may be interacting and influencing each other during communication. To test the hypothesis that multisensory responses in auditory cortex are influenced by visual inputs from the superior temporal sulcus (STS), an association area, we recorded local field potentials and single neurons from both structures concurrently in monkeys. The functional interactions between the auditory cortex and the STS, as measured by spectral analyses, increased in strength during presentations of dynamic faces and voices relative to either communication signal alone. These interactions were not solely modulations of response strength, because the phase relationships were significantly less variable in the multisensory condition as well. A similar analysis of functional interactions within the auditory cortex revealed no similar interactions as a function of stimulus condition, nor did a control condition in which the dynamic face was replaced with a dynamic disk mimicking mouth movements. Single neuron data revealed that these intercortical interactions were reflected in the spiking output of auditory cortex and that such spiking output was coordinated with oscillations in the STS. The vast majority of single neurons that were responsive to voices showed integrative responses when faces, but not control stimuli, were presented in conjunction. Our data suggest that the integration of faces and voices is mediated at least in part by neuronal cooperation between auditory cortex and the STS and that interactions between these structures are a fast and efficient way of dealing with the multisensory communication signals.
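For readers wondering what "functional interactions as measured by spectral analyses" and "less variable phase relationships" cash out to in practice: below is a minimal Python sketch of two standard measures of this kind of LFP-LFP coupling, spectral coherence and the phase-locking value. This is not the authors' actual pipeline; the simulated signals, sampling rate, lag, and frequency band are all assumptions chosen purely for illustration.

    # Illustrative sketch only (not the Ghazanfar et al. analysis).
    # Simulates two coupled "LFPs" and quantifies their coupling.
    import numpy as np
    from scipy.signal import coherence, butter, filtfilt, hilbert

    fs = 1000.0                      # sampling rate in Hz (assumed)
    t = np.arange(0, 2.0, 1.0 / fs)  # 2 s of simulated data

    # A shared ~40 Hz oscillation plus independent noise stands in for
    # simultaneously recorded auditory-cortex and STS field potentials;
    # the STS trace is shifted by 10 samples (~10 ms) to give it a
    # consistent phase lag.
    shared = np.sin(2 * np.pi * 40 * t)
    lfp_auditory = shared + 0.5 * np.random.randn(t.size)
    lfp_sts = np.roll(shared, 10) + 0.5 * np.random.randn(t.size)

    # 1) Spectral coherence: frequency-resolved coupling strength.
    freqs, coh = coherence(lfp_auditory, lfp_sts, fs=fs, nperseg=256)
    print("coherence near 40 Hz:", coh[np.argmin(np.abs(freqs - 40))])

    # 2) Phase-locking value (PLV): consistency of the phase
    #    relationship, i.e., what "less variable phase relationships"
    #    refers to. Band-pass to 30-50 Hz, extract instantaneous phase
    #    via the Hilbert transform, and average the phase differences
    #    on the unit circle (PLV = 1 means perfectly locked phases).
    b, a = butter(4, [30 / (fs / 2), 50 / (fs / 2)], btype="band")
    phase_a = np.angle(hilbert(filtfilt(b, a, lfp_auditory)))
    phase_s = np.angle(hilbert(filtfilt(b, a, lfp_sts)))
    plv = np.abs(np.mean(np.exp(1j * (phase_a - phase_s))))
    print("gamma-band PLV:", plv)

On simulated data like this, both measures come out high at 40 Hz; the point of the paper is that such coupling between STS and auditory cortex increased specifically for combined face-plus-voice stimuli.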

2 comments:

Anonymous said...

It seems that we get audio-visual associations between a given auditory feature and its visual counterpart at the lowest level at which that feature is represented.

Brosch et al. (2005) studied monkeys heavily trained on a simple auditory task that used tone stimuli and, importantly, was cued by a light. In primary auditory cortex they found auditory cells that responded to the light.

"Nonauditory Events of a Behavioral Procedure Activate Auditory Cortex of Highly Trained Monkeys"

Michael Brosch, Elena Selezneva, and Henning Scheich

A central tenet in brain research is that early sensory cortex is modality specific, and, only in exceptional cases, such as deaf and blind subjects or professional musicians, is influenced by other modalities. Here we describe extensive cross-modal activation in the auditory cortex of two monkeys while they performed a demanding auditory categorization task: after a cue light was turned on, monkeys could initiate a tone sequence by touching a bar and then earn a reward by releasing the bar on occurrence of a falling frequency contour in the sequence. In their primary auditory cortex and posterior belt areas, we found many acoustically responsive neurons whose firing was synchronized to the cue light or to the touch or release of the bar. Of 315 multiunits, 45 exhibited cue light-related firing, 194 exhibited firing that was related to bar touch, and 268 exhibited firing that was related to bar release. Among 60 single units, we found one neuron with cue light-related firing, 21 with bar touch-related firing, and 36 with release-related firing. This firing disappeared at individual sites when the monkeys performed a visual detection task. Our findings corroborate and extend recent findings on cross-modal activation in the auditory cortex and suggest that the auditory cortex can be activated by visual and somatosensory stimulation and by movements. We speculate that the multimodal corepresentation in the auditory cortex has arisen from the intensive practice of the subjects with the behavioral procedure and that it facilitates the performance of audiomotor tasks in proficient subjects.

Greg Hickok said...

Definitely blurs the distinction between unimodal and multisensory cortex. My Irvine colleague Ron Frostig has done work in rodents making the same sort of point. Thanks, Daniel!