|M.Sc Student||Yael Lustig|
|Subject||Machine Learning Framework for Crowdsourced Natural|
|Department||Department of Industrial Engineering and Management||Supervisor||Full Professor Carmel Domshlak|
For more than a decade we have been witnessing an explosion of non-textual information on the web, dominated by images and videos. Although intensive efforts have been made to ease search over textual data, making sense of and searching within large data sets of images and other types of visual information remains an open challenge. This challenge draws the attention of the research community across disciplines, with the most canonical tasks targeted by this research being image categorization and image labeling. In this work we contribute to this effort, focusing on the task of optimizing the “naturalness” of the labels selected for image annotation in large-scale repositories.

To date, a wide palette of algorithms for automated image categorization is available, each shining on a different set of visual features. These algorithms can be used directly, together or separately, to automatically label images with a description of their content. However, when image categorization or labeling is performed by humans, the output is affected by numerous factors that go beyond the purely visual, such as the overall content of the data repository, the cultural background of the users, their information needs, and common knowledge about various semantic properties of the entities in the images. Thus, optimizing automated image labeling with respect to the end use of the images requires taking these factors into account to the largest extent possible, making the labels more “natural” in the larger context of information consumption and communication. Since automatic image classifiers inherently do not account for these factors, there is a gap that needs bridging. The goal of our work is to get closer to the category or label that a typical user of a given data repository would apply to an image.