|M.Sc. Student||Gebert Ella|
|Subject||Virtual Acoustic Space Technique for the Study of Auditory Motion Processing in the Barn Owls: Setup and Evaluation|
|Department||Department of Medicine||Supervisor||Professor Yoram Gutfreund|
|Full Thesis text|
The ability to accurately determine the location of a moving sound source is essential to an animal's survival, whether hunting or fleeing. Barn owls are renowned for their remarkable ability to localize objects by sound alone. In this thesis, we aim to study the neural responses of barn owls to moving sounds.
Processing a moving stimulus is more complex than processing a stationary one: the object must be represented in both time and space. It has also been shown that, in some cases, an object is more easily detected when it moves. Response patterns unique to motion include receptive-field shifts and selectivity to direction.
In this study I developed a method to present and study auditory motion in barn owls. The barn owl's known neuronal centers for auditory localization include the inferior colliculus and the optic tectum in the midbrain, and Field L, the entopallium, and the arcopallium in the forebrain. Ascending the auditory processing chain, the external nucleus of the inferior colliculus (ICX) is the first neuronal center containing a two-dimensional map of space, in which neurons respond with sharp sensitivity curves to specific sound-source directions.
One of the main challenges in studying auditory motion is controlling the sensory scene while providing accurate, clean auditory cues. Techniques for presenting moving sounds include apparent motion produced by an array of remote speakers, convolution with interpolated head-related impulse responses (HRIRs), interaural time difference (ITD) sweeps, frequency-modulated sweeps, and real motion of the source.
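To illustrate one of these techniques, an ITD sweep can be sketched in a few lines: broadband noise is delivered to the two ears with a time difference that changes smoothly over the stimulus, producing the timing-cue pattern of a source sweeping across the midline. The sample rate, ITD range, and fractional-delay method below are illustrative choices for the sketch, not the parameters used in this thesis.

```python
import numpy as np

FS = 48_000          # sample rate (Hz)
DUR = 1.0            # stimulus duration (s)
ITD_START = -200e-6  # initial ITD (s); negative here means left-leading
ITD_END = 200e-6     # final ITD (s); positive here means right-lagging

n = int(FS * DUR)
t = np.arange(n) / FS
rng = np.random.default_rng(0)
noise = rng.standard_normal(n)

# Time-varying ITD: a linear sweep from ITD_START to ITD_END.
itd = np.linspace(ITD_START, ITD_END, n)

# Split the delay symmetrically between the ears using
# fractional-sample linear interpolation of the same noise token.
left = np.interp(t - itd / 2, t, noise)   # advanced when itd < 0
right = np.interp(t + itd / 2, t, noise)  # delayed when itd > 0
stereo = np.stack([left, right], axis=1)  # (samples, 2) earphone signal
```

Because both channels carry the identical noise token, the only cue that changes over the stimulus is the interaural timing, which is what makes such sweeps useful as an isolated motion cue.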
In this work, I simulated moving sounds through earphones. The played sounds are based on free-field recordings of moving auditory stimuli. This kind of natural-like stimulus (not constructed from a collection of discrete locations or gradual cue shifts) can support more reliable conclusions about perceived location. I implemented this by recording a sound source in real motion and processing the recordings for playback through the earphones.
I evaluated the performance of the setup in two ways: first, by analyzing the auditory recordings to expose their inherent localization cues, and second, by recording extracellular neuronal activity from the ICX of barn owls and verifying whether the responses matched the intended auditory cues. Possible future applications of this system include studying different localization cues and their roles in the processing of motion, multimodal integration, auditory illusions concerning the perceived location of a sound source, perception of location when several moving stimuli are combined, and more.
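One common way to expose a timing cue in a binaural recording is to locate the peak of the cross-correlation between the two channels; applied in sliding windows, the peak lag traces an ITD trajectory that can be compared with the intended motion path. The function below is a minimal sketch of that idea, not the analysis pipeline used in the thesis; the sign convention and the 300 µs search bound (a plausible range for barn-owl ITDs) are assumptions of the sketch.

```python
import numpy as np

def estimate_itd(left, right, fs, max_itd=300e-6):
    """Estimate the ITD (seconds) of a binaural segment from the peak
    of the cross-correlation, searched within a plausible lag window.
    Convention (an assumption of this sketch): positive ITD means the
    left channel leads the right."""
    max_lag = int(round(max_itd * fs))
    # Full linear cross-correlation; numpy's convention is
    # c[k] = sum_n left[n + k] * right[n], with k the lag of `left`.
    xc = np.correlate(left, right, mode="full")
    lags = np.arange(-(len(right) - 1), len(left))
    mask = np.abs(lags) <= max_lag
    best = lags[mask][np.argmax(xc[mask])]
    # The peak sits at lag k = -d when right[n] = left[n - d]
    # (right lagging by d), so negate to make "left leads" positive.
    return -best / fs

# Hypothetical check: delay the right channel by 5 samples and
# recover the corresponding positive ITD.
fs = 48_000
rng = np.random.default_rng(1)
x = rng.standard_normal(4800)
d = 5
right_lagged = np.concatenate([np.zeros(d), x[:-d]])
itd_hat = estimate_itd(x, right_lagged, fs)
```

Restricting the search to physiologically plausible lags keeps spurious correlation peaks at large offsets from dominating the estimate in short, noisy windows.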