M.Sc Student: Lelchouk Yana
Subject: Gesture-Driven Human-Computer Interface
Department: Department of Mechanical Engineering
Supervisors: Professor Miriam Zacksenhouse, Professor Emeritus Moshe Shpitalni
Full Thesis text
Recognition of human hand movements has the potential to be a natural and powerful tool for developing and supporting intuitive human-computer interaction in a variety of applications, including the analysis of complex scientific data, medical training, military simulation, and virtual prototyping.
To develop effective human-computer interaction, it is important to investigate the fundamentals and basic techniques of human-human communication and to take into consideration the technological aspects of the human-computer dialogue. Hence, this research emphasizes the importance of considering both the computerized system and the human operator in developing successful interfaces. In this work, I develop a simple and dynamic language, compare different types of gestures and postures, and optimize their recognition. I present a natural, hand-driven human-computer interface based on an intuitive language, as an alternative to a traditional keyboard and mouse. The unique aspect of the interface is that it includes both dynamic gestures and static postures. This combination enables the development of an expressive and figurative language, which facilitates interaction with the computer and the operation of a complicated system.
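The pairing of static postures with dynamic gestures can be sketched as a small lookup-based interpreter. This is a minimal illustration only: the posture and gesture names and their pairings below are hypothetical, not the actual vocabulary defined in the thesis.

```python
# Hypothetical command vocabulary: each "word" of the interface language
# combines a recognized static posture with a dynamic gesture.
LANGUAGE = {
    ("flat_hand", "swipe_left"): "previous_view",
    ("flat_hand", "swipe_right"): "next_view",
    ("fist", "push_forward"): "confirm",
    ("point", "circle"): "select_region",
}

def interpret(posture: str, gesture: str) -> str:
    """Map a (posture, gesture) pair to a command, or flag it as unknown."""
    return LANGUAGE.get((posture, gesture), "unknown_command")

print(interpret("fist", "push_forward"))  # -> confirm
```

Because postures and gestures combine multiplicatively, even a handful of each yields an expressive command set without requiring the operator to memorize many isolated signs.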
Similar to real sign languages, my language includes words, phrases, sentences, and grammatical rules. The language definitions and design rely on linguistic, cognitive, perceptual, human-comfort, and computational factors. The language implementation relies on meaningful feature extraction and data reduction using Principal Components Analysis, and on pattern recognition using neural networks. It has been shown that Fuzzy ARTMAP provides better recognition rates and is less sensitive to the size of the training set than back-propagation networks. The interface has been implemented and tested in virtual-environment and robotic-system applications.
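The feature-extraction step can be sketched as a standard PCA projection. The sketch below is illustrative, not the thesis implementation: the feature layout (22 values per sample, as from a hypothetical data glove) and the number of retained components are assumptions.

```python
import numpy as np

def pca_reduce(samples: np.ndarray, k: int):
    """Project hand-pose feature vectors onto their first k principal components.

    samples: (n_samples, n_features) array of raw measurements,
    e.g. joint angles from a data glove (hypothetical layout).
    Returns the reduced data plus the mean and components needed
    to project new samples the same way.
    """
    mean = samples.mean(axis=0)
    centered = samples - mean
    # Eigen-decomposition of the covariance matrix gives the principal axes.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # eigh returns eigenvalues in ascending order; keep the k largest.
    order = np.argsort(eigvals)[::-1][:k]
    components = eigvecs[:, order]
    return centered @ components, mean, components

# Example: reduce 22 simulated sensor readings per sample to 3 components.
rng = np.random.default_rng(0)
data = rng.normal(size=(100, 22))
reduced, mean, comps = pca_reduce(data, 3)
```

The reduced vectors, rather than the raw sensor readings, would then be fed to the classifier (e.g. a Fuzzy ARTMAP or back-propagation network), shrinking both training time and sensitivity to sensor noise.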