ORFE Professor Boris Hanin organized the first in-person Princeton Machine Learning Summer School, held from June 13th through June 17th. The school, supported by an NSF CAREER grant and by funding from ORFE, CSML, PACM, and SEAS, hosted approximately 60 student participants for an intensive week of four lecture courses.
The summer school began with courses by Sebastien Bubeck of MSR Redmond and Nati Srebro of TTIC and UChicago. Bubeck's course opened with an overview of his breakthrough work on the Lipschitz constant of interpolating functions in high dimensions, which he calls "The Law of Robustness." Srebro's course surveyed the vibrant field of implicit bias in deep learning, covering the roles of initialization, optimization, and architecture. These were followed by courses from Soledad Villar of Johns Hopkins and Tengyu Ma of Stanford. Villar spoke about computational and theoretical aspects of equivariant (i.e., symmetry-aware) neural networks. Ma's course focused on a different aspect of representation learning: a new insight connecting unsupervised contrastive pre-training to spectral graph theory.
Photo courtesy of Frank Wojciechowski.