[Distinguished Lecture] Quartic Regularity
Speaker(s): Yurii Nesterov (CORE/INMA, UCLouvain)
Time: 10:00-11:00, July 16, 2024
Venue: Wenyuan Lecture Hall, Zhihua Building, PKU
Abstract: In this talk, we propose new linearly convergent second-order methods for minimizing convex quartic polynomials. This framework is applied to the design of optimization schemes that can solve general convex problems satisfying a new condition of quartic regularity, which assumes positive definiteness and boundedness of the fourth derivative of the objective function. For such problems, an appropriate quartic regularization of the Damped Newton Method has a global linear rate of convergence. We discuss several important consequences of this result. In particular, it can be used for constructing new second-order methods in the framework of high-order proximal-point schemes. These methods have convergence rate $\tilde O(k^{-p})$, where $k$ is the iteration counter, $p$ is equal to 3, 4, or 5, and the tilde indicates the presence of logarithmic factors in the complexity bounds for the auxiliary problems, which have to be solved at each iteration of the schemes.
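To convey the flavor of a quartically regularized Newton step, the sketch below is a minimal illustration, not the method presented in the talk. Each iteration minimizes the local quadratic model of the objective augmented with a quartic penalty $\frac{\sigma}{4}\|h\|^4$; the stationarity condition $(\nabla^2 f(x) + \sigma\|h\|^2 I)h = -\nabla f(x)$ reduces the subproblem to a one-dimensional fixed-point search on $r = \|h\|$. All function names and the parameter sigma are illustrative assumptions.

```python
import numpy as np

def quartic_regularized_newton(grad, hess, x0, sigma=1.0, tol=1e-8, max_iter=100):
    """Illustrative damped-Newton-type iteration with a quartic regularizer.

    Each step minimizes m(h) = <g, h> + 0.5 <H h, h> + (sigma/4) ||h||^4,
    whose stationarity condition is (H + sigma ||h||^2 I) h = -g. The step
    norm r = ||h|| is the fixed point of r -> ||(H + sigma r^2 I)^{-1} g||,
    found here by bisection; it is unique when H is positive definite.
    """
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break

        def step_norm(r):
            # Norm of the candidate step h(r) = -(H + sigma r^2 I)^{-1} g.
            return np.linalg.norm(np.linalg.solve(H + sigma * r**2 * np.eye(n), -g))

        lo, hi = 0.0, 1.0
        while step_norm(hi) > hi:   # expand the bracket until ||h(r)|| <= r
            hi *= 2.0
        for _ in range(60):         # bisect for the fixed point r = ||h(r)||
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if step_norm(mid) > mid else (lo, mid)
        r = 0.5 * (lo + hi)
        x = x + np.linalg.solve(H + sigma * r**2 * np.eye(n), -g)
    return x

# Example: minimize the convex quartic f(x) = 0.25 ||x||^4 + 0.5 x'Ax - b'x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -2.0])
grad = lambda x: (x @ x) * x + A @ x - b
hess = lambda x: (x @ x) * np.eye(2) + 2.0 * np.outer(x, x) + A
x_star = quartic_regularized_newton(grad, hess, np.zeros(2))
```

The quartic penalty keeps the step bounded without a line search: larger proposed steps incur a rapidly growing regularization cost, which plays the role of the damping factor in the classical Damped Newton Method.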
Bio-Sketch: Yurii Nesterov is an internationally recognized expert in convex optimization, especially in the development of efficient algorithms and numerical optimization analysis. He is currently a professor at the University of Louvain (UCLouvain). In 1977, he graduated in applied mathematics from Moscow State University. From 1977 to 1992, he was a researcher at the Central Economic Mathematical Institute of the Russian Academy of Sciences. Since 1993, he has been working at UCLouvain, specifically in the Department of Mathematical Engineering of the Louvain School of Engineering and the Center for Operations Research and Econometrics. In 2000, Nesterov received the Dantzig Prize. In 2009, he won the John von Neumann Theory Prize. In 2016, he received the EURO Gold Medal. In 2023, Yurii Nesterov and Arkadi Nemirovski received the WLA Prize in Computer Science or Mathematics, "for their seminal work in convex optimization theory".

Nesterov is most famous for his work in convex optimization, including his 2004 book Introductory Lectures on Convex Optimization, considered a canonical reference on the subject. His main novel contribution is an accelerated version of gradient descent that converges considerably faster than ordinary gradient descent (commonly referred to as Nesterov momentum, Nesterov Acceleration, or Nesterov accelerated gradient, in short NAG). His work with Arkadi Nemirovski in their 1994 book was the first to show that interior-point methods can solve general convex optimization problems, and the first to make a systematic study of semidefinite programming (SDP). In the same book, they introduced self-concordant functions, which are useful in the analysis of Newton's method.