Academic Talk by Associate Professor Hao Wang, ShanghaiTech University


Posted by: Wei Han    Date: 2024-04-09

Title: An iteratively reweighted second-order method for nonconvex regularization

Time: April 14, 2024 (Sunday), 16:00

Venue: Room 1102, Research Building No. 18

Hosts: School of Mathematics and Statistics; Key Laboratory of Analytical Mathematics and Applications (Ministry of Education); Fujian Provincial Key Laboratory of Analytical Mathematics and Applications; Key Laboratory of Statistics and Artificial Intelligence of Fujian Universities; Fujian Center for Applied Mathematics (Fujian Normal University)

Audience: Interested faculty and graduate students

 

报告摘要This paper considers a class of nonconvex sparsity-promoting regularization problems with a twice continuously differentiable loss function. We present a second-order algorithm to solve this class of nonconvex and nonsmooth problems. Most existing algorithms are first-order methods, and a hybrid of the proximal gradient method and subspace regularized Newton method was proposed for $\ell_p$ regularization until recently. Our new method is also a hybrid method with main features including: (i) our method is based on the iteratively reweighted method with the regularization term being iteratively approximated by a weighted l1 regularization term, so that it can be applied to various nonconvex regularization problems. (ii) Our method alternatively solves the l1 approximation by a soft-thresholding step and the subspace approximate Newton step. (iii) The iterates generated by our algorithm have unchanged sign values, and the nonzero components are bounded away from 0 for sufficiently large iterations, and the algorithm eventually reverts to a perturbed Newton method. (iv) We prove global convergence and a local quadratic convergence rate under loose assumptions for our method and demonstrate its efficiency on a large set of model prediction problems.

 

Speaker bio: Dr. Hao Wang is a Shanghai Young Eastern Scholar and is currently an associate professor in the School of Information Science and Technology at ShanghaiTech University. He received his Ph.D. from the Department of Industrial Engineering at Lehigh University (USA) in May 2015, and his M.S. (2010) and B.S. (2007) in Mathematics and Applied Mathematics from Beihang University. His current research focuses on nonlinear optimization and nonconvex regularization problems and the associated algorithms. His main results have appeared in journals including SIAM Journal on Optimization, Journal of Machine Learning Research, and IEEE Transactions on Computers.