On The Global Convergence of Randomized Coordinate Gradient Descent for Non-Convex Optimization
Posted: January 3, 2024
Posted by: Xiaoni Tan
Speaker: Zi'ang Chen (MIT)
Time: January 5, 2024, 10:00–11:00
Venue: Room 9, Quan Zhai, BICMR
Abstract: We analyze the global convergence properties of coordinate gradient descent with random choices of coordinates and stepsizes for non-convex optimization problems. Under generic assumptions, we prove that the algorithm iterates almost surely escape strict saddle points of the objective function. As a consequence, the algorithm is guaranteed to converge to local minima whenever all saddle points are strict. Our proof is based on viewing the coordinate descent algorithm as a nonlinear random dynamical system and on a quantitative finite-block analysis of its linearization around saddle points. This is joint work with Yingzhou Li and Jianfeng Lu.
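To make the algorithm concrete, here is a minimal sketch of coordinate gradient descent with randomly chosen coordinates and stepsizes, in the spirit of the abstract. The function name, the uniform stepsize distribution, and the test objective are illustrative assumptions, not details taken from the paper; the actual stepsize rule analyzed in the talk may differ.

```python
import numpy as np

def randomized_coordinate_gd(grad, x0, stepsize_range=(0.01, 0.1),
                             n_iters=20_000, rng=None):
    """Sketch of coordinate gradient descent with random coordinates
    and random stepsizes (names and stepsize law are assumptions).

    Each iteration picks one coordinate uniformly at random and a
    stepsize uniformly from `stepsize_range`, then updates only that
    coordinate along its negative partial derivative.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iters):
        i = rng.integers(x.size)            # random coordinate index
        eta = rng.uniform(*stepsize_range)  # random stepsize
        # For simplicity the full gradient is evaluated and one entry used;
        # a practical implementation would compute only the i-th partial.
        x[i] -= eta * grad(x)[i]
    return x

# Hypothetical example: f(x, y) = x^2 - y^2 + y^4 / 2 has a strict saddle
# at the origin (Hessian diag(2, -2)) and local minima at (0, +-1).
grad = lambda x: np.array([2 * x[0], -2 * x[1] + 2 * x[1] ** 3])
print(randomized_coordinate_gd(grad, x0=[0.5, 0.01]))
```

Starting near the strict saddle at the origin, the random coordinate updates push the iterates away from it and toward a local minimizer near (0, ±1), which is the qualitative behavior the talk's almost-sure escape result guarantees.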