Momentum-based acceleration of first-order optimization methods, first introduced by Nesterov, has been foundational to the theory and practice of large-scale optimization and machine learning. However, a fundamental understanding of such acceleration remains a long-standing open problem. In the past few years, several new acceleration mechanisms, distinct from Nesterov's, have been discovered, and the similarities and dissimilarities among these new acceleration phenomena hint at a promising avenue of attack on the open problem. In this talk, we discuss the envisioned goal of developing a mathematical theory unifying this collection of acceleration mechanisms and the challenges that must be overcome.
Bio: Ernest Ryu is an assistant professor in the Department of Mathematical Sciences and an affiliated faculty of the Graduate School of Artificial Intelligence at Seoul National University. His current research focus is on optimization and deep learning theory.
Professor Ryu received a B.S. degree in Physics and Electrical Engineering with honors from the California Institute of Technology in 2010, and an M.S. in Statistics and a Ph.D. in Computational and Mathematical Engineering from Stanford University in 2016, where he received the Gene Golub Best Thesis Award. In 2016, he joined the Department of Mathematics at the University of California, Los Angeles, as an Assistant Adjunct Professor. In 2020, he joined the Department of Mathematical Sciences at Seoul National University as a tenure-track faculty member.