12-19 Tianyuan Fund Interdisciplinary Lecture Series on Geometry, Stochastic Analysis and Their Applications, No. 95: Xudong Li

Published: 2017-12-14


Title: Exploring the Second Order Sparsity in Large Scale Optimization
Speaker: Dr. Xudong Li, Princeton University
Time: December 19 (Tuesday), 10:30--11:30
Venue: Room 1518
Abstract:
In this talk, we shall demonstrate how the second order sparsity (SOS) in important optimization problems, such as sparse optimization models, semidefinite programming, and many others, can be exploited to design efficient algorithms.
The SOS property allows us to incorporate semismooth Newton methods into the augmented Lagrangian method framework in such a way that the subproblems involved require only low to moderate computational cost; e.g., for lasso problems with sparse solutions, the cost of solving the subproblems at each iteration of our second-order method is comparable to, or even lower than, that of many first-order methods.
Consequently, with this fast convergence rate in hand, usually asymptotically superlinear, we have now reached the stage of being able to solve many challenging large-scale convex optimization problems efficiently and robustly.
For the purpose of illustration, we present a highly efficient software package called LassoNAL for solving the well-known Lasso-type problems.
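
To give a concrete sense of why sparse solutions make the second-order steps cheap, the following minimal Python sketch illustrates the SOS idea for the lasso problem min_x 1/2||Ax-b||^2 + lam*||x||_1. It is not the LassoNAL implementation: the data A, b, the penalty parameter sigma, and the trial point y below are illustrative assumptions. The underlying fact is standard: an element of the generalized Jacobian of the soft-thresholding map is a 0/1 diagonal matrix supported on the active set S, so the semismooth Newton matrix I + sigma * A P A^T reduces to I + sigma * A_S A_S^T and involves only the |S| active columns of A.

    import numpy as np

    def soft_threshold(y, tau):
        # Proximal mapping of tau * ||.||_1 (soft-thresholding).
        return np.sign(y) * np.maximum(np.abs(y) - tau, 0.0)

    def ssn_newton_matrix(A, y, sigma, lam):
        # Build one illustrative semismooth Newton matrix I + sigma * A P A^T.
        # An element P of the generalized Jacobian of soft-thresholding at y is a
        # 0/1 diagonal matrix whose ones lie on the support of soft_threshold(y),
        # so A P A^T = A_S A_S^T: the dominant cost scales with |S| (the second
        # order sparsity), not with the full dimension n.
        m, _ = A.shape
        support = np.flatnonzero(soft_threshold(y, lam * sigma))  # active set S
        A_S = A[:, support]                                       # only |S| columns enter
        return np.eye(m) + sigma * (A_S @ A_S.T), support

    # Toy usage: the Newton system is m x m, and only |S| columns of A appear in it.
    rng = np.random.default_rng(0)
    m, n, sigma, lam = 50, 2000, 1.0, 2.0
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    b = rng.standard_normal(m)
    y = A.T @ b                                 # a hypothetical trial point
    H, S = ssn_newton_matrix(A, y, sigma, lam)
    d = np.linalg.solve(H, b)                   # cost driven by m and |S|, not n
    print(f"{S.size} of {n} columns of A enter the Newton matrix")

The snippet is only a schematic of where the savings come from when solutions are sparse; the actual solver design in LassoNAL is the subject of the talk.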