National Center for Mathematics and Interdisciplinary Sciences, Hefei Branch: Seminar [Xiangxiong Zhang]


Title: The asymptotic convergence rate of the Douglas-Rachford iteration for basis pursuit

Speaker: Xiangxiong Zhang, Department of Mathematics, Purdue University

Time: August 5, 2015, 2:00-3:00 PM

Venue: Room 1218, School of Mathematical Sciences, Management and Research Building, East Campus

Abstract:

For large-scale nonsmooth convex optimization problems, first-order methods involving only subgradients are usually preferred because they scale well with problem size. Douglas-Rachford (DR) splitting is one of the most popular first-order methods in practice. It is well known that DR applied to the dual problem is equivalent to the alternating direction method of multipliers (ADMM), widely used in nonlinear mechanics, and to the split Bregman method from the image processing community. As motivating examples, we first briefly review several famous convex recovery results, including compressive sensing, matrix completion, and PhaseLift, which represent the success of the convex relaxation approach in attacking certain NP-hard linear inverse problems over the last decade. When DR is applied to these convex optimization problems, one question of practical interest is how the parameters in DR affect its performance. We will present an explicit formula for the sharp asymptotic convergence rate of DR for simple L1 minimization. The analysis will be verified on examples of processing seismic data in the Curvelet domain. This is joint work with Prof. Laurent Demanet at MIT.
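For readers unfamiliar with the setting, basis pursuit is the problem min ||x||_1 subject to Ax = b, and DR splitting alternates the proximal maps of the L1 term and of the affine constraint. The sketch below is not taken from the talk; it is a minimal illustrative implementation, assuming A has full row rank, with a single step parameter gamma standing in for the DR parameters whose effect on the asymptotic rate the talk analyzes. All function and variable names here are hypothetical.

import numpy as np

def soft_threshold(v, gamma):
    # proximal map of gamma*||.||_1: componentwise shrinkage
    return np.sign(v) * np.maximum(np.abs(v) - gamma, 0.0)

def dr_basis_pursuit(A, b, gamma=1.0, iters=500):
    # Douglas-Rachford iteration for min ||x||_1 s.t. Ax = b,
    # splitting the objective into the L1 term and the indicator
    # of the affine constraint; assumes A has full row rank.
    m, n = A.shape
    AAt_inv = np.linalg.inv(A @ A.T)   # small dense inverse, for illustration only
    def project(z):                    # Euclidean projection onto {x : Ax = b}
        return z - A.T @ (AAt_inv @ (A @ z - b))
    z = np.zeros(n)
    for _ in range(iters):
        x = project(z)                        # prox of the constraint indicator
        y = soft_threshold(2 * x - z, gamma)  # prox of gamma*||.||_1 at the reflection
        z = z + y - x                         # DR update of the auxiliary variable
    return project(z)                         # feasible iterate converging to a minimizer

In this form the choice of gamma does not change the limit, only how fast the auxiliary variable z settles, which is the kind of parameter dependence of the asymptotic rate that the abstract refers to for L1 minimization.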
 

All faculty and students are welcome to attend!