M.Sc Student: Nae Yael
Subject: Utilizing Virtual Space Techniques to Study Neuronal Processing of Moving Audio-Visual Stimuli in the
Department: Department of Medicine
Supervisor: Professor Yoram Gutfreund
To attain a coherent global percept, the brain must integrate a wide range of sensory information. Interactions between sensory modalities influence perception, perceptual ambiguity, and behavior. Alongside the well-established psychophysics of such interactions, much remains to be explored in multisensory physiology. Recent research suggests that integration within single multimodal neurons may play a crucial role in multisensory integration. One major challenge is to reveal the pathways and brain sites where multisensory information converges and to characterize the neural computations that underlie cross-modal perceptual integration. While recent years have seen growing use of complex, fairly natural stimulation paradigms to study neural processing of sensory information, this trend has been mostly limited to uni-modal systems. Motion is a complex, natural multisensory stimulus and one of the most important cues for survival. Using motion stimuli to study audio-visual integration has several advantages: perceptual binding of audio-visual stimuli has been shown to be enhanced by motion, and motion provides an additional dimension of congruency, namely direction. Although the mechanisms of uni-sensory responses to motion have been studied previously, the physiological principles underlying the representation of multisensory motion remain vague.
In this project, we utilized head-related transfer function (HRTF)-based virtual acoustic space techniques, in which sound presented to the barn owl via earphones imitates the free-field auditory space while still allowing simultaneous presentation of visual stimuli and synchronized recording of spike activity. Using this technique, we studied the physiological responses of multimodal neurons in the barn owl's Optic Tectum (the homologue of the mammalian Superior Colliculus) to uni- and bi-modal stimuli moving in horizontal and vertical directions. The HRTF-based system as implemented provides a reliable simulation of auditory space, as verified by comparing the HRTF-based acoustic receptive field obtained with static stimuli to that obtained with binaural cues. Presenting horizontally moving stimuli in auditory and visual uni-modal tests shifted the preferred azimuth toward the origin of the motion. Such a shift has been demonstrated previously in barn owls, and it has been suggested that it compensates for motor delay, steering the nervous system toward the expected behavioral response. We also conducted bi-modal motion tests. Although only initial and somewhat inconclusive results were obtained, the experience gained from these trials and the methodology employed can support further bi-modal motion studies.
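The core idea of an HRTF-based virtual acoustic space can be sketched in a few lines: a mono sound is convolved with the left- and right-ear head-related impulse responses (HRIRs) for a given direction, and the resulting two-channel signal is delivered over earphones. The sketch below is illustrative only and is not the thesis's actual rig; the synthetic HRIRs (a simple interaural time and level difference) stand in for measured, per-direction owl HRIRs.

```python
import numpy as np

def render_virtual_source(mono, hrir_left, hrir_right):
    """Convolve a mono signal with left/right head-related impulse
    responses (HRIRs) to synthesize a binaural earphone signal."""
    left = np.convolve(mono, hrir_left)    # full convolution, length N + K - 1
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0) # shape (2, N + K - 1)

# Hypothetical stand-in HRIRs: a source off to the right is mimicked by
# delaying and attenuating the left-ear impulse response.
fs = 48000
rng = np.random.default_rng(0)
mono = rng.standard_normal(fs // 10)       # 100 ms noise burst
hrir_right = np.zeros(64)
hrir_right[0] = 1.0                        # right ear: direct, full level
hrir_left = np.zeros(64)
hrir_left[5] = 0.7                         # left ear: delayed 5 samples, quieter

binaural = render_virtual_source(mono, hrir_left, hrir_right)
print(binaural.shape)
```

Real systems replace the toy impulse responses with HRIRs measured in the animal's own ears, so that the earphone signal carries the full direction-dependent spectral and binaural cues of free-field sound.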