Audiovisual perception: Implicit estimation of sound-arrival time
Nature 421, 911 (27 February 2003); doi:10.1038/421911a

In perceiving the sound produced by the movement of a visible object, the brain coordinates the auditory and visual inputs (refs 1-3) so that no delay is noticed, even though the sound arrives later (for distant sources, such as aircraft or firework displays, this compensation is less effective). Here we show that this coordination occurs because the brain uses distance information supplied by the visual system to calibrate simultaneity. Our findings indicate that auditory and visual inputs are coordinated not because the brain has a wide temporal window for auditory integration, as was previously thought, but because the brain actively shifts the temporal location of this window according to the distance of the visible sound source.

Seven subjects with normal vision and hearing were presented through headphones with a burst of white noise (90 decibels sound-pressure level, 10-ms duration, 4-ms rise and fall times), the spectrum of which had been processed using head-related transfer functions to simulate an external sound from a frontal direction. Brief light flashes (10 ms) were produced by an array of five green light-emitting diodes (LEDs) placed at different distances from the subjects (1-50 m; Fig. 1). The intensity of the light flash was 14.5 candelas per square metre at a viewing distance of 1 m and was increased in proportion to the square of the viewing distance at the other distances, so that the intensity reaching the eye remained constant. The difference in onset times between the sound and light stimuli was varied randomly from -125 ms to 175 ms in steps of 25 ms.

Figure 1 | Synchrony in audiovisual perception. [Figure and full legend not reproduced.]

Subjects were instructed to look at the centre of the LED array and to imagine that the LEDs were the source of both the light and the sound, as though hearing the sound directly from that source. To eliminate possible bias effects, we used a two-alternative forced-choice task to measure subjective simultaneity: observers judged whether the light was presented before or after the sound.

Twenty responses were obtained for each condition. To determine the stimulus-onset asynchrony corresponding to subjective simultaneity, we estimated the 50% point (the point of subjective equality) by fitting a cumulative normal-distribution function to each individual's data with a maximum-likelihood curve-fitting technique.

When the LED array was 1 m away, the point of subjective equality occurred at a sound delay of about 5 ms; however, the sound delay at this point increased with viewing distance (P < 0.001; Fig. 1a, b). This increase was roughly consistent with the velocity of sound (about 1 m per 3 ms at sea level and room temperature): the point of subjective equality grew by about 3 ms with each 1-m increase in distance. This relationship held at least up to a distance of 10 m. Our results show that the brain probably takes sound velocity into account when judging simultaneity.
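To make the fitting procedure described above concrete, the following is a minimal sketch in Python (using NumPy and SciPy) of how a cumulative normal psychometric function can be fitted by maximum likelihood to two-alternative forced-choice counts, with its 50% point taken as the point of subjective equality. The stimulus-onset asynchronies follow the experiment; the response counts are invented placeholders, not the authors' data.

```python
# Sketch: maximum-likelihood fit of a cumulative normal psychometric function
# to 2AFC "light came first" judgements; the 50% point is the point of
# subjective equality (PSE). Response counts below are illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Stimulus-onset asynchronies (sound delay relative to light onset, in ms):
# -125 ms to 175 ms in 25-ms steps, as in the experiment (13 conditions).
soa = np.arange(-125, 176, 25)

n_trials = 20  # responses per condition
# Hypothetical number of "light came first" responses at each SOA.
n_light_first = np.array([0, 1, 1, 2, 4, 8, 12, 16, 18, 19, 20, 20, 20])

def neg_log_likelihood(params):
    """Binomial negative log-likelihood of a cumulative-normal curve."""
    mu, sigma = params
    p = norm.cdf(soa, loc=mu, scale=sigma)   # P("light first") at each SOA
    p = np.clip(p, 1e-6, 1 - 1e-6)           # guard against log(0)
    return -np.sum(n_light_first * np.log(p)
                   + (n_trials - n_light_first) * np.log(1 - p))

# Fit mu (the 50% point, i.e. the PSE) and sigma for one observer.
fit = minimize(neg_log_likelihood, x0=[0.0, 50.0],
               bounds=[(-200.0, 200.0), (1.0, 500.0)])
pse_ms, sigma_ms = fit.x
print(f"Point of subjective equality: {pse_ms:.1f} ms sound delay")

# For comparison, the physical sound delay predicted by the speed of sound
# (~343 m/s, about 2.9 ms per metre) at a viewing distance of, say, 10 m:
print(f"Physical sound delay at 10 m: {10 / 343 * 1000:.1f} ms")
```

In this sketch the fitted mu would be compared across viewing distances; a slope of roughly 3 ms per metre, as reported above, would indicate that subjective simultaneity tracks the physical sound-travel time.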
However, it takes about 120 ms for sound to travel 40 m, and we found that the threshold for detecting the sound delay was 106 ms at a viewing distance of 40 m, so active compensation is likely to operate only at distances shorter than this. We have shown that the brain takes sound velocity into account when integrating audiovisual information. The brain can therefore integrate audiovisual information over a wide range of temporal gaps and correctly match sounds to their visual sources.

Yoichi Sugita* and Yôiti Suzuki
*National Institute of Advanced Industrial Science and Technology, Neuroscience Research Institute, Teragu-hakusan 1497-1, Tsukuba 300-4201, Japan
Research Institute of Electrical Communication and Graduate School of Information Sciences, Tohoku University, Katahira, Aoba-ku, Sendai 980-8577, Japan
e-mail: y.sugita@aist.go.jp

References
1. Sekuler, R., Sekuler, A. B. & Lau, R. Nature 385, 308 (1997).
2. McDonald, J. J., Teder-Sälejärvi, W. A. & Hillyard, S. A. Nature 407, 906-908 (2000).
3. Shams, L., Kamitani, Y. & Shimojo, S. Nature 408, 788 (2000).

Competing financial interests: declared none.