Yue Lu, Harvard University

Nonlinear Random Matrices in Estimation and Learning: Equivalence Principles and Applications
Date
Feb 25, 2025, 4:30 pm – 5:30 pm

Event Description

In recent years, new classes of structured random matrices have emerged in statistical estimation and machine learning. Understanding their spectral properties has become increasingly important, as these matrices are closely linked to key quantities such as the training and generalization performance of large neural networks and the fundamental limits of high-dimensional signal recovery. Unlike classical random matrix ensembles, these new matrices often involve nonlinear transformations, introducing additional structural dependencies that pose challenges for traditional analysis techniques.

In this talk, I will present a set of equivalence principles that establish asymptotic connections between various nonlinear random matrix ensembles and simpler linear models that are more tractable for analysis. I will then demonstrate how these principles can be applied to characterize the performance of kernel methods and random feature models across different scaling regimes and to provide insights into the in-context learning capabilities of attention-based Transformer networks.
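The equivalence principles mentioned above are of the following flavor: a nonlinear random matrix (for example, a random feature matrix) has, asymptotically, the same spectrum as a simpler linear "Gaussian equivalent" model. The sketch below is purely illustrative and not taken from the talk; the activation (tanh), the matrix dimensions, and the specific surrogate form (a linear-plus-noise ansatz with coefficients mu_0, mu_1, mu_*) are assumptions chosen for demonstration.

    # Illustrative sketch of a Gaussian-equivalence statement for random features.
    # Setup (assumed): data X in R^{n x d} and weights W in R^{d x N} with i.i.d.
    # standard normal entries, and nonlinear features F = sigma(X W / sqrt(d)).
    # The linear surrogate replaces F by
    #   F_lin = mu_0 + mu_1 * (X W / sqrt(d)) + mu_* * Z,   Z independent Gaussian,
    # where mu_0 = E[sigma(xi)], mu_1 = E[xi sigma(xi)],
    #       mu_*^2 = Var[sigma(xi)] - mu_1^2, with xi ~ N(0, 1).
    # In the proportional regime (n, d, N large and comparable), F^T F / n and
    # F_lin^T F_lin / n have the same limiting spectral distribution.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, N = 2000, 1500, 1000
    sigma = np.tanh

    X = rng.standard_normal((n, d))
    W = rng.standard_normal((d, N))
    G = X @ W / np.sqrt(d)                      # linear part of the features

    # Estimate the coefficients of sigma by Monte Carlo.
    xi = rng.standard_normal(200_000)
    mu0 = sigma(xi).mean()
    mu1 = (xi * sigma(xi)).mean()
    mu_star = np.sqrt(sigma(xi).var() - mu1**2)

    F = sigma(G)                                # nonlinear random features
    F_lin = mu0 + mu1 * G + mu_star * rng.standard_normal((n, N))

    # Compare the spectra of the two Gram matrices; the quantiles should be close.
    eig_nl = np.linalg.eigvalsh(F.T @ F / n)
    eig_lin = np.linalg.eigvalsh(F_lin.T @ F_lin / n)
    print("nonlinear  spectrum quantiles:", np.percentile(eig_nl, [10, 50, 90]))
    print("equivalent spectrum quantiles:", np.percentile(eig_lin, [10, 50, 90]))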

Short Bio: Yue M. Lu is a Harvard College Professor and Gordon McKay Professor of Electrical Engineering and Applied Mathematics at Harvard University. He has also held visiting appointments at Duke University (2016) and the École Normale Supérieure (ENS) in Paris (2019). His research focuses on the mathematical foundations of high-dimensional statistical estimation and learning. His contributions have been recognized with several best paper awards (IEEE ICIP, ICASSP, and GlobalSIP), the ECE Illinois Young Alumni Achievement Award (2015), and the IEEE Signal Processing Society Distinguished Lecturership (2022). He is a Fellow of the IEEE (Class of 2024).

Event Category
ORFE Department Colloquia