|M.Sc. Student||Birnboim Irit|
|Subject||Early Cortical Bi- and Tri-Modal Interactions are Evoked in a Virtual Reality World|
|Department||Department of Medicine||Supervisors||Professor Emeritus Hillel Pratt, Professor Miriam Reiner|
Multimodal cues constitute the majority of our natural sensory inputs. Consequently, the brain is adapted to process and integrate information from multiple modalities. But how is information from these different, seemingly separate, sensory processes integrated? Many cortical areas have been shown to take part in such interactions from early stages of processing. Our goal was to investigate the early stages of everyday multimodal perception; specifically, we wished to find when and where multimodal interactions occur in the cortex.
A Virtual Reality (VR) platform was chosen for delivery of auditory, haptic and 3D visual cues. In order to create a pseudo-real experience, an interactive game was designed and implemented on the VR system. Participants detected and responded to simple natural uni-, bi- and tri-modal cues in order to save a virtual treasure. Behavioral responses and electroencephalogram (EEG) were recorded and the cortical generators of the recorded activity were estimated. Analyses of multimodal interactions were based on comparisons between multimodal and unimodal cues.
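The comparison between multimodal and unimodal responses is often carried out with an additive model, in which the evoked response to a bimodal cue is contrasted with the sum of the responses to its two unimodal components; any nonzero difference is taken as evidence of an interaction. The thesis's exact analysis may differ, but a minimal sketch of this standard approach, using synthetic single-channel ERP waveforms (all signal names and values are hypothetical), could look like:

```python
import numpy as np

# Hypothetical single-channel ERPs (in microvolts), time-locked to cue
# onset: a 1-second epoch sampled at 256 Hz. The waveforms below are
# synthetic stand-ins, not data from the study.
rng = np.random.default_rng(0)
n_samples = 256
erp_auditory = rng.normal(0.0, 1.0, n_samples)   # response to auditory-only cue
erp_haptic = rng.normal(0.0, 1.0, n_samples)     # response to haptic-only cue

# Toy bimodal response, constructed to be sub-additive (smaller than the
# sum of the unimodal responses) for illustration.
erp_bimodal = erp_auditory + erp_haptic - 0.5

# Additive-model interaction term: bimodal minus sum of unimodals.
# Nonzero values indicate a multimodal interaction; negative values
# indicate a sub-additive interaction.
interaction = erp_bimodal - (erp_auditory + erp_haptic)

print(interaction.mean())
```

In practice the interaction waveform would be computed per electrode and per time point across subjects, and tested for significance before source estimation.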
Behavioral results revealed that tri-modal cues were detected more often and processed faster than bi-modal cues, which, in turn, exhibited the same advantage over unimodal cues. Thus, in the current setting, perception of multimodal cues was enhanced compared to unimodal cues. Our main electrophysiological results revealed multiple sites of multimodal integration, both in multimodal and in traditionally unimodal cortices, from very early stages of processing (~30 ms). Interactive multimodal regions included the lateral prefrontal lobes, the medial frontal gyri, the temporal lobes, the anterior cingulate and the insulae. Multimodal interactive "unimodal" regions were found in the associative somatosensory cortex and in the associative visual cortex (the lingual cortex, the cuneus and the fusiform gyrus).
Similarities and disparities between the current and previous findings were discussed. A review of the types of connectivity between cortical regions suggested that direct feed-forward connections between unimodal areas, as well as feedback and feed-forward connections between multimodal and unimodal brain regions, plausibly participated in the early multimodal interactions observed here. In addition, an explanation for the sub-additive nature of the results was offered: sub-processes that are active in a given cortical region during sensory perception are completed earlier in response to multimodal than to unimodal stimuli. Finally, improvements to the current setting, and possible applications of this type of research in medicine and in the field of VR, were proposed.