January 25, 2005 | Virginie van Wassenhove, Ken W. Grant, and David Poeppel
The study by van Wassenhove, Grant, and Poeppel investigates the neural processing of auditory speech when visual speech is present. They found that visual speech speeds up the cortical processing of auditory signals within 100 ms of signal onset, reflecting an articulator-specific temporal facilitation and a non-specific amplitude reduction. The latency facilitation depends on the degree to which the visual signal predicts possible auditory targets. These findings support the existence of abstract internal representations that constrain the analysis of subsequent speech inputs, suggesting an "analysis-by-synthesis" mechanism in auditory-visual speech perception. The study also highlights the importance of natural, ecologically valid stimulation in understanding multisensory integration, particularly in the context of speech perception.