A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization
Abstract:
We propose a generic approach to accelerating gradient-based optimization algorithms with quasi-Newton principles. The proposed scheme, called QuickeNing, can be applied to incremental first-order methods such as stochastic variance-reduced gradient (SVRG) or incremental surrogate optimization (MISO). It is also compatible with composite objectives, meaning that it can provide exactly sparse solutions when the objective involves a sparsity-inducing regularization. QuickeNing relies on limited-memory BFGS (L-BFGS) rules, making it well suited to high-dimensional optimization problems. Moreover, it enjoys a worst-case linear convergence rate for strongly convex problems. We present experimental results where QuickeNing yields significant improvements over competing methods on large-scale, high-dimensional machine learning problems.
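As a rough illustration of the scheme described above, the following Python sketch applies L-BFGS rules to the Moreau-Yosida smoothing of an objective, with the inexact proximal subproblem solved by a generic first-order method (here, plain gradient descent stands in for SVRG or MISO). The function names, the smoothing parameter kappa, and the toy quadratic are illustrative assumptions, not the authors' implementation; composite terms and the line-search safeguard of the full algorithm are omitted for brevity.

```python
# A minimal sketch, assuming the standard Moreau-Yosida viewpoint: L-BFGS steps
# are taken on the smoothed objective F(x) = min_z f(z) + (kappa/2)||z - x||^2,
# whose gradient is kappa * (x - prox(x)). Plain gradient descent stands in for
# SVRG/MISO as the inner solver; all names and parameter values here are
# illustrative assumptions, not the authors' implementation.
import numpy as np

def inner_gd(grad_f, center, kappa, z0, lr=0.1, iters=50):
    """Approximately solve min_z f(z) + (kappa/2)||z - center||^2 (inexact prox)."""
    z = z0.copy()
    for _ in range(iters):
        z -= lr * (grad_f(z) + kappa * (z - center))
    return z

def quickening_sketch(grad_f, x0, kappa=1.0, memory=5, outer_iters=30):
    x = x0.copy()
    z = inner_gd(grad_f, x, kappa, x)
    g = kappa * (x - z)               # gradient of the Moreau envelope at x
    s_hist, y_hist = [], []           # L-BFGS curvature pairs
    for _ in range(outer_iters):
        # Two-loop recursion: q becomes (approx. inverse Hessian) * g.
        q, alphas = g.copy(), []
        for s, y in reversed(list(zip(s_hist, y_hist))):
            a = (s @ q) / (y @ s)
            alphas.append(a)
            q -= a * y
        if y_hist:                    # initial scaling gamma * I
            q *= (s_hist[-1] @ y_hist[-1]) / (y_hist[-1] @ y_hist[-1])
        for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):
            q += (a - (y @ q) / (y @ s)) * s
        # Unit step; the full algorithm adds a sufficient-decrease safeguard.
        x_new = x - q
        z = inner_gd(grad_f, x_new, kappa, x_new)
        g_new = kappa * (x_new - z)
        s, y = x_new - x, g_new - g
        if y @ s > 1e-12:             # keep only positive-curvature pairs
            s_hist.append(s); y_hist.append(y)
            if len(s_hist) > memory:
                s_hist.pop(0); y_hist.pop(0)
        x, g = x_new, g_new
    return x

# Toy check on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.diag([1.0, 3.0, 5.0])
b = np.array([1.0, 2.0, 3.0])
print(quickening_sketch(lambda x: A @ x - b, np.zeros(3)))
print(np.linalg.solve(A, b))          # the two should roughly agree
```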
Biography: Julien Mairal is a research scientist at Inria. He received a graduate degree from Ecole Polytechnique, France, in 2005, and a PhD degree from the Ecole Normale Supérieure, Cachan, France, in 2010. He then joined the statistics department of UC Berkeley as a postdoctoral researcher, before moving to Inria in 2012. His research interests include machine learning, computer vision, and statistical image and signal processing. In 2013, he received the Cor Baayen prize, awarded every year by ERCIM to a promising young researcher in computer science and applied mathematics, and in 2016 he received an ERC Starting Grant.