Primal and Dual Predicted Decrease Approximation Methods
Abstract:
We introduce the notion of predicted decrease approximation (PDA) for constrained optimization, a flexible framework which includes as special cases known algorithms such as the generalized conditional gradient, proximal gradient, greedy coordinate descent for separable constraints, and working set methods for linear equality constraints with bounds. This allows us to provide a unified convergence analysis for these methods. We further consider a partially strongly convex nonsmooth model and show that dual application of PDA-based methods yields new sublinear convergence rate estimates in terms of both primal and dual objectives. As an application, we provide an explicit working set selection rule for SMO-type methods for training the support vector machine, with an improved primal convergence analysis. The talk is based on joint work with Edouard Pauwels and Shoham Sabach.
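The abstract does not spell out the PDA framework itself, but one of the special cases it names, the conditional gradient (Frank-Wolfe) method, admits a short self-contained sketch. The following Python example is purely illustrative and not taken from the talk: it assumes a least-squares objective over the probability simplex, and the Frank-Wolfe gap computed inside the loop is the classical "predicted decrease" quantity that PDA-type analyses generalize.

    import numpy as np

    def frank_wolfe_simplex(grad, x0, n_iters=200, tol=1e-8):
        """Conditional gradient (Frank-Wolfe) over the probability simplex.

        The gap g(x_k) = <grad f(x_k), x_k - s_k> is the classical
        predicted-decrease quantity; PDA replaces it with an approximation.
        """
        x = x0.copy()
        for k in range(n_iters):
            g = grad(x)
            # Linear minimization oracle over the simplex: a vertex e_i
            # with i minimizing the gradient coordinate.
            i = np.argmin(g)
            s = np.zeros_like(x)
            s[i] = 1.0
            gap = g @ (x - s)          # predicted decrease (Frank-Wolfe gap)
            if gap <= tol:
                break
            step = 2.0 / (k + 2.0)     # standard open-loop step size
            x = (1.0 - step) * x + step * s
        return x

    # Hypothetical usage: f(x) = 0.5 * ||Ax - b||^2 over the simplex.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = rng.standard_normal(30)
    grad = lambda x: A.T @ (A @ x - b)
    x0 = np.full(10, 0.1)              # feasible starting point
    x_star = frank_wolfe_simplex(grad, x0)
    print("objective:", 0.5 * np.linalg.norm(A @ x_star - b) ** 2)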
Biography: Amir Beck is a Professor in the Department of Industrial Engineering at the Technion - Israel Institute of Technology. He has published numerous papers, has given invited lectures at international conferences, and was awarded the Salomon Simon Mani Award for Excellence in Teaching and the Henry Taub Research Prize. His research interests are in continuous optimization, including theory, algorithmic analysis, and applications. He is an associate editor of Mathematics of Operations Research, Mathematical Programming Series A, and the Journal of Optimization Theory and Applications, and an area editor for optimization in Operations Research. His research has been supported by various funding agencies, including the Israel Science Foundation, the German-Israeli Foundation, the US-Israel Binational Science Foundation, the Israeli Science and Energy Ministries, and the European Community. Website: https://sites.google.com/site/amirbeck314/