M.Sc. Student: Salman Tamer
Subject: Learning Polynomial Generating Rules Using Support Vector Machines
Department: Department of Computer Science
Supervisors: Professor Emeritus Yoram Baram, Prof. Alfred Bruckstein
Model selection for support vector machines (SVMs) has been widely applied and investigated for radial basis function (RBF) kernels, where computation time has been reduced significantly by using smooth upper bounds on the generalization error. However, no conclusions have been drawn about the analytical structure of the rule that generates and labels the data. In this work we present a simple model selection algorithm for polynomial-kernel SVMs that tunes the degree of the kernel to its best-fit value. We show that this procedure provides some understanding of the analytical behavior of the rule generating the data, assuming that rule is polynomial. Although the procedure is computationally expensive, and many of the speed-ups developed for RBF-kernel model selection cannot be applied to it, it can be very helpful in uncovering ``rules of nature'' and analyzing certain qualities and behaviors.

We present the case of recognizing circles among 2D geometrical shapes, and show that some conclusions can be reached about the properties that separate circles from other geometrical shapes. A training sample consists of feature vectors describing various properties of 2D shapes. A feature selection procedure can be applied to this data to choose a subset of the features; the remaining, discriminating features are the area and the perimeter of the shape. The system is trained with various oracles produced with different mistake factors, one of which is the average vote of a group of people. We show that the resulting system's success rate exceeds that of the best member of the oracle group, and that the model selection procedure yields a quadratic relation between the area and the perimeter of the shape, a known mathematical fact.
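The degree-tuning step can be sketched as a cross-validated search over candidate polynomial degrees. This is a minimal illustration, not the thesis implementation: scikit-learn, the synthetic data, and the degree range 1-5 are all assumptions introduced here.

```python
# Sketch (assumed, not the thesis code): pick the polynomial-kernel degree
# that maximizes cross-validated accuracy, standing in for the model
# selection procedure described in the abstract.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

# Synthetic stand-in for the feature vectors of 2D shapes.
X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

best_degree, best_score = None, -np.inf
for degree in range(1, 6):  # hypothetical candidate degrees
    clf = SVC(kernel="poly", degree=degree, coef0=1.0, C=1.0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    if score > best_score:
        best_degree, best_score = degree, score

print("selected polynomial degree:", best_degree)
```

The selected degree is then read as a hint about the analytical form of the generating rule, under the assumption that the rule is polynomial.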
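The quadratic relation recovered by the procedure is the standard one for circles: with perimeter P = 2&pi;r and area A = &pi;r&sup2;, it follows that A = P&sup2;/(4&pi;). A quick numerical check of this identity (the radius value is arbitrary):

```python
import math

# For a circle of radius r: perimeter P = 2*pi*r and area A = pi*r**2,
# hence A = P**2 / (4*pi) -- the quadratic area-perimeter relation
# that the model selection procedure recovers.
r = 3.0
P = 2 * math.pi * r
A = math.pi * r ** 2
assert math.isclose(A, P ** 2 / (4 * math.pi))
```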