M.Sc. Thesis, Department of Electrical Engineering
Supervisor: Prof. Meir Ron
This thesis is devoted to the study of complexity-regularized mixture classifiers. By mixture classifiers we refer to linear mixtures of classifiers drawn from a basic class of classifiers. The use of such classifiers can be justified from a Bayesian point of view or, alternatively, by complexity enrichment considerations. However, the class of mixture classifiers possesses considerable complexity, and complexity regularization is therefore required.
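In standard notation (the symbols here are illustrative and not taken from the thesis itself), a linear mixture of classifiers from a base class $\mathcal{H}$ can be written as

```latex
f(x) \;=\; \sum_{j=1}^{k} w_j \, h_j(x),
\qquad h_j \in \mathcal{H},
\quad w_j \ge 0,
\quad \sum_{j=1}^{k} w_j = 1,
```

where the mixture weights $w = (w_1, \dots, w_k)$ are constrained to the probability simplex; under the Bayesian interpretation mentioned above, $w$ plays the role of a posterior over the base classifiers.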
We present a detailed analysis of complexity-constrained mixture classifiers, and establish uniform loss bounds for the discrete class of constrained mixture classifiers. Further, we show that extending this framework to the continuous case yields identical results.
Recently, novel techniques have been introduced that address the problem of uniform loss bounds from a local perspective. By a local perspective we refer to an approach that considers only a certain portion of the class of classifiers, consisting of `good' classifiers, rather than the entire class directly. Various authors have established general loss bounds in this setting, yet they have neither addressed the class of mixture classifiers nor provided explicit loss bounds. In this work we apply the results of the local analysis to the class of mixture classifiers. We establish explicit loss bounds and study their behavior. We consider their advantages and drawbacks, and provide means for overcoming these drawbacks, relying on our analysis of complexity-constrained mixture classifiers.
Finally, oracle inequalities are established using the same techniques. In addition, we provide an alternative interpretation of the established results from the perspective of kernel methods.