|M.Sc Student||Gavinsky Dmitry|
|Subject||Boosting with Polynomially Bounded Distributions|
|Department||Department of Computer Science||Supervisor||Professor Nader Bshouty|
The research thesis is partly based on the paper "On Boosting with Optimal Poly-Bounded Distributions" by Bshouty and Gavinsky.
We construct a framework that makes the distributions produced by some boosting algorithms polynomially smooth (with respect to the PAC oracle's distribution) without loss of performance. We then explore the case of Freund and Schapire's AdaBoost algorithm, bounding its distributions to be polynomially smooth. The main advantage of AdaBoost over other boosting techniques is that it is adaptive, i.e., it is able to take advantage of weak hypotheses that are more accurate than was assumed a priori. We show that this adaptiveness is preserved, and even improved, by our technique.
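As a rough illustration only (not the smoothing scheme of this thesis), the sketch below shows AdaBoost's adaptive step: the weight of a weak hypothesis grows as its error drops below the a priori bound, and the example distribution is reweighted accordingly. The `cap` parameter is a hypothetical stand-in for a polynomial smoothness bound, applied as a single truncate-and-renormalize pass:

```python
import math

def alpha(eps):
    # Adaptive hypothesis weight: the smaller the weak error eps (< 1/2),
    # the larger the vote -- this is the adaptiveness of AdaBoost.
    return 0.5 * math.log((1 - eps) / eps)

def reweight(dist, correct, eps, cap=None):
    # One AdaBoost distribution update. `dist` is the current distribution
    # over examples, `correct[i]` says whether the weak hypothesis
    # classified example i correctly, `eps` is its weighted error.
    a = alpha(eps)
    new = [d * math.exp(-a if c else a) for d, c in zip(dist, correct)]
    z = sum(new)
    new = [w / z for w in new]
    if cap is not None:
        # Hypothetical smoothness bound: truncate any single weight and
        # renormalize (one pass; a crude stand-in for true smoothing).
        new = [min(w, cap) for w in new]
        z = sum(new)
        new = [w / z for w in new]
    return new
```

With the standard update (no cap), the misclassified examples end up carrying exactly half of the total weight, which is what forces each new weak hypothesis to do nontrivial work on the previous one's mistakes.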
Our scheme allows AdaBoost to be executed in the on-line boosting mode (i.e., to perform boosting "by filtering"). Executed this way, and possessing the smoothness property, AdaBoost may now be applied efficiently to a wider range of learning problems than before.
In particular, we demonstrate an application of AdaBoost to the task of DNF membership learning, which results in an "adaptive" number of boosting iterations and, consequently, an adaptive size of the final hypothesis produced (thus answering affirmatively a question posed by Jackson).
The results achieved in this work open interesting directions for further research; in particular, two new publications based on this thesis are currently being prepared for submission.