Exploring the Second Order Sparsity in Large Scale Optimization
Date Published: 2017-12-07
Published By: Xiaoni Tan
Speaker(s): Dr. Xudong Li, Princeton University
Time: 10:00-11:00, December 13, 2017
Venue: Room 9, Quan Zhai, BICMR
In this talk, we shall demonstrate how the second order sparsity (SOS) in important optimization problems, such as sparse optimization models, semidefinite programming, and many others, can be exploited to design efficient algorithms.
The SOS property allows us to incorporate semismooth Newton methods into the augmented Lagrangian method framework in such a way that the subproblems involved need only low to medium computational costs. For example, for lasso problems with sparse solutions, the cost of solving the subproblem at each iteration of our second order method is comparable to, or even lower than, that of many first order methods.
Consequently, armed with this fast convergence rate, which is usually asymptotically superlinear, we are now able to solve many challenging large scale convex optimization problems efficiently and robustly.
As an illustration, we present a highly efficient software package called LassoNAL for solving well-known Lasso-type problems.
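To give a concrete feel for where the second order sparsity enters, here is a minimal NumPy sketch. It is not the speaker's LassoNAL code; the function names `ssn_step` and `solve_lasso` are illustrative, and a simplified setting is used: a semismooth Newton method applied to the proximal-gradient fixed-point equation of the Lasso, safeguarded by plain proximal-gradient steps. The key structural point is that an element of the generalized Jacobian of soft-thresholding is a 0/1 diagonal matrix, so each Newton system reduces to a dense system whose size equals the current support.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal mapping of tau*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ssn_step(A, b, lam, x, t):
    """One semismooth Newton step on the fixed-point residual
    F(x) = x - prox_{t*lam*||.||_1}(x - t*A^T(Ax - b)) of the Lasso.

    An element of the Clarke Jacobian of soft-thresholding is a 0/1
    diagonal matrix D, so the Newton matrix I - D + t*D*A^T*A reduces
    to a dense system of size |J|, the current support size.
    """
    n = A.shape[1]
    u = x - t * (A.T @ (A @ x - b))
    Fx = x - soft_threshold(u, t * lam)
    J = np.abs(u) > t * lam              # active coordinates (D_ii = 1)
    d = np.empty(n)
    d[~J] = -Fx[~J]                      # inactive rows give d_i = -F_i directly
    AJ = A[:, J]
    # Reduced system: t*(A_J^T A_J) d_J = -F_J - t*A_J^T A_{J^c} d_{J^c}
    rhs = -Fx[J] - t * (AJ.T @ (A[:, ~J] @ d[~J]))
    H = t * (AJ.T @ AJ) + 1e-8 * np.eye(int(J.sum()))  # tiny ridge for safety
    d[J] = np.linalg.solve(H, rhs)
    return x + d, np.linalg.norm(Fx), int(J.sum())

def solve_lasso(A, b, lam, iters=50):
    # Safeguarded iteration: accept a Newton step only if it reduces the
    # fixed-point residual; otherwise take a proximal-gradient step.
    t = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x_new, res, nnz = ssn_step(A, b, lam, x, t)
        u = x_new - t * (A.T @ (A @ x_new - b))
        res_new = np.linalg.norm(x_new - soft_threshold(u, t * lam))
        if res_new < res:
            x = x_new
        else:
            x = soft_threshold(x - t * (A.T @ (A @ x - b)), t * lam)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, k = 200, 1000, 10
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true + 0.01 * rng.standard_normal(m)
    lam = 0.1 * np.linalg.norm(A.T @ b, np.inf)
    x = solve_lasso(A, b, lam)
    print("recovered nonzeros:", int((np.abs(x) > 1e-8).sum()))
```

On a random sparse instance like the one above, the active set shrinks quickly toward the true support, so the reduced Newton systems become small: this is the sense in which the per-iteration cost of a second order method can be comparable to that of first order methods.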