8-30 Seminar of the National Center for Mathematics and Interdisciplinary Sciences, Hefei Branch [Cheng Wang]


Title: Preconditioned Steepest Descent (PSD) solver for regularized convex optimization problems

Speaker: Cheng Wang, Associate Professor, University of Massachusetts Dartmouth

Time: August 30, 10:45-11:30

Venue: Room 1218

Abstract:

A few preconditioned steepest descent (PSD) solvers are presented for certain optimization problems in which the solution corresponds to a convex energy functional. The highest- and lowest-order terms are constant-coefficient, positive linear operators. Using the energy dissipation property, we derive a discrete bound for the solution, as well as an upper bound for the second derivative of the energy. These bounds allow us to investigate the convergence properties of our method. In particular, a geometric convergence rate is shown for the nonlinear PSD iteration applied to the regularized equation, which provides a much sharper theoretical result than existing works. Some numerical simulation results are also presented in the talk.
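To illustrate the type of iteration the abstract refers to, the following is a minimal sketch of a preconditioned steepest descent loop in Python. The model energy E(u) = 1/2 u^T A u + 1/4 Σ u_i^4 - b^T u, the use of the constant-coefficient linear part A as the preconditioner, and the backtracking step-size rule are illustrative assumptions, not the speaker's exact scheme.

```python
import numpy as np

def psd_minimize(E, grad_E, apply_Pinv, u0, tol=1e-8, max_iter=500):
    """Preconditioned steepest descent (PSD) with backtracking line search.

    E          : convex energy functional, E(u) -> float
    grad_E     : gradient of E, grad_E(u) -> array
    apply_Pinv : applies the inverse of a positive, constant-coefficient
                 linear preconditioner P (here assumed SPD)
    """
    u = u0.copy()
    for k in range(max_iter):
        g = grad_E(u)
        if np.linalg.norm(g) < tol:          # stop once the gradient is small
            break
        d = apply_Pinv(g)                    # preconditioned descent direction
        alpha = 1.0
        # Backtracking: since P is SPD, g.d > 0, so -d is a descent direction
        # and the Armijo condition below is satisfied for small enough alpha.
        while E(u - alpha * d) > E(u) - 1e-4 * alpha * np.dot(g, d):
            alpha *= 0.5
        u = u - alpha * d                    # steepest descent update
    return u, k

# Toy usage with an assumed model problem (not from the talk):
# E(u) = 1/2 u^T A u + 1/4 sum(u^4) - b^T u, preconditioned by P = A.
n = 64
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # SPD linear part
b = np.random.default_rng(0).standard_normal(n)

E = lambda u: 0.5 * u @ A @ u + 0.25 * np.sum(u**4) - b @ u
grad_E = lambda u: A @ u + u**3 - b
apply_Pinv = lambda g: np.linalg.solve(A, g)

u_star, iters = psd_minimize(E, grad_E, np.zeros(n).copy().__class__ and apply_Pinv, np.zeros(n))
print(iters, np.linalg.norm(grad_E(u_star)))
```

In this sketch the preconditioner plays the role of the constant-coefficient, positive linear operator mentioned in the abstract: solving with A at each step rescales the gradient so the nonlinear iteration behaves like a well-conditioned fixed-point map, which is the setting in which a geometric convergence rate can be expected.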