Ph.D. Thesis

Three Algorithms for Large-Scale Constrained Optimization Problems with Applications

Ph.D. Student: Tetruashvili Lubov
Department of Industrial Engineering and Management
Supervisors: Professor Amir Beck, Professor Emeritus Aharon Ben-Tal

Abstract

In this thesis we present three algorithms for solving large-scale optimization problems. Two of the three methods belong to the class of first-order gradient methods.

The first algorithm presented in the thesis is an extension of the subgradient projection method for convex constrained problems. We use a non-Euclidean projection operator that is adjusted to the geometry of the feasible set. Since the idea of applying non-Euclidean operators originates in the Mirror Descent method, we call our algorithm the Constrained Mirror (CoMirror) method.

Our second algorithm, the Sequential Ascending Parameter (SAP) method, also solves convex constrained problems, but it treats the constraints in a completely different way from the CoMirror method. In the SAP method we replace the original problem by a sequence of parametric discrete min-max problems and use a very simple scheme for updating the parameter, which is a lower bound on the optimal value of the problem. The SAP algorithm produces a sequence that converges to the optimal value.

The third method can be viewed as a general scheme for solving nonconvex smooth optimization problems. At each iteration the nonconvex feasible set is replaced by an inner convex approximation, defined by means of an upper bound on the nonconvex constraint function. Under appropriate conditions on this upper bounding convex function, monotone convergence to a KKT point is established.
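To give a feel for the CoMirror idea, the following is a minimal sketch of a mirror-descent-style loop with constraint handling on the unit simplex, using the entropy prox so the non-Euclidean step becomes a multiplicative update. This is an illustrative assumption, not the thesis's exact method; the function names, the feasibility tolerance, and the diminishing step-size rule are all chosen for the example.

```python
import numpy as np

def comirror_sketch(f, grad_f, g, grad_g, x0, steps, eps=1e-3):
    """Illustrative CoMirror-style loop on the unit simplex.

    Sketches min f(x) s.t. g(x) <= 0 over the simplex, using the
    entropy prox so the mirror step is a multiplicative update.
    """
    x = x0.copy()
    best_x, best_f = None, np.inf
    for t in range(steps):
        if g(x) <= eps:              # nearly feasible: take a step on f
            d = grad_f(x)
            if f(x) < best_f:        # record the best feasible point
                best_x, best_f = x.copy(), f(x)
        else:                        # infeasible: take a step on g
            d = grad_g(x)
        step = 1.0 / np.sqrt(t + 1)  # diminishing step size (assumption)
        x = x * np.exp(-step * d)    # entropic (non-Euclidean) mirror step
        x /= x.sum()                 # renormalize onto the simplex
    return best_x if best_x is not None else x
```

For instance, minimizing a linear objective c·x over the simplex under the linear constraint x[0] - 0.5 <= 0 drives the iterates toward the vertex with the smallest cost while keeping track of the best nearly feasible point seen.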
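To make the inner-approximation scheme concrete, consider a toy one-dimensional instance (entirely illustrative, not taken from the thesis): minimize (x - 0.2)^2 subject to the nonconvex constraint 1 - x^2 <= 0. Linearizing the concave term -x^2 at the current iterate overestimates the constraint function, so the resulting linear constraint defines a convex set contained in the true feasible set, and minimizing over it yields the next iterate.

```python
def inner_approx_step(xk):
    """One iteration of an inner convex approximation (toy example).

    Nonconvex constraint: c(x) = 1 - x**2 <= 0.  Since -x**2 is concave,
    its linearization at xk upper-bounds c, giving the convex constraint
        1 - xk**2 - 2*xk*(x - xk) <= 0,
    i.e. (for xk > 0) the inner convex set {x >= (1 + xk**2) / (2*xk)}.
    """
    b = (1.0 + xk**2) / (2.0 * xk)
    # Minimize f(x) = (x - 0.2)**2 over {x >= b}: project 0.2 onto it.
    return max(0.2, b)

x = 2.0                      # feasible starting point: 1 - 2**2 <= 0
for _ in range(20):
    x = inner_approx_step(x)
# The iterates 2.0, 1.25, 1.025, ... decrease monotonically to x = 1,
# a KKT point of min (x - 0.2)**2 s.t. 1 - x**2 <= 0.
```

Every iterate stays feasible for the original problem, and the objective decreases monotonically, mirroring the convergence behavior described above.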