Kenichi Nagasawa, University of Warwick

Smoothed Pairwise Difference Estimators: Distribution Theory and Bootstrap Inference
Date
Sep 23, 2019, 12:30 pm - 1:30 pm
Location
101 - Sherrerd Hall
Event Description

Finding planted "signals" in random "noise" is a theme that captures problems arising in several different areas, such as machine learning (compressed sensing, matrix completion, sparse principal component analysis, regression, recovering planted clusters), average-case complexity (stochastic block model, planted clique, random constraint satisfaction), and cryptography (attacking the security of pseudorandom generators). For some of these problems (e.g., variants of compressed sensing and matrix completion), influential works over the past two decades identified the right convex relaxation, together with techniques for analyzing it, that guarantee nearly optimal recovery (with respect to the information-theoretic threshold). However, these methods are problem-specific and do not immediately generalize to other problems or variants.

This talk is about a principled approach, via the sum-of-squares method, to designing and analyzing the "right" convex relaxation, one that offers optimal recovery guarantees for a broad class of average-case problems, including all those listed above. I'll illustrate this approach by focusing on a recent work that obtains the state-of-the-art algorithm for clustering mixtures of Gaussians.
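As a rough sketch of the framework the talk builds on (notation mine, not from the abstract): the degree-$d$ sum-of-squares method relaxes a polynomial feasibility problem $\{x \in \mathbb{R}^n : p_1(x) \ge 0, \dots, p_m(x) \ge 0\}$ by searching not for a solution $x$ but for a degree-$d$ pseudo-expectation, a linear functional on low-degree polynomials that behaves like an expectation over solutions as far as degree-$d$ sum-of-squares reasoning can tell:

```latex
\begin{align*}
&\text{find a linear map } \tilde{\mathbb{E}} : \mathbb{R}[x]_{\le d} \to \mathbb{R}
  \text{ such that} \\
&\quad \tilde{\mathbb{E}}[1] = 1, \\
&\quad \tilde{\mathbb{E}}[q^2] \ge 0
  \quad \text{for all } q \text{ with } \deg q \le d/2, \\
&\quad \tilde{\mathbb{E}}[p_i \, q^2] \ge 0
  \quad \text{for all } i \text{ and all } q \text{ with } \deg(p_i q^2) \le d.
\end{align*}
```

This feasibility problem is a semidefinite program of size $n^{O(d)}$, so it can be solved in polynomial time for constant $d$; the "right" convex relaxation for a given average-case problem is then a suitable choice of the constraint polynomials $p_i$.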

I'll explain how, in this paradigm, the process of coming up with the convex relaxation and its analysis can be black-boxed into giving a "simple" proof of unique identifiability, i.e., a statement that the signal (such as the mixture components or the added clique) is information-theoretically uniquely identified by the observed data. As a consequence of this method, we obtain efficient algorithms that beat the guarantees of all previously known methods for basic problems such as separating mixtures of Gaussians and robustly estimating moments of high-dimensional distributions.
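The mechanism behind this black-boxing can be sketched as follows (again a schematic under my own notation, not the talk's): suppose the identifiability argument shows that any point $x$ satisfying the problem's constraints $\mathcal{A}$ has an estimate $\mu(x)$ close to the true signal $\mu^*$, and that this argument is itself certifiable by a degree-$d$ sum-of-squares proof, written $\mathcal{A} \vdash_{d} \|\mu(x) - \mu^*\|^2 \le \varepsilon$. Since a degree-$d$ pseudo-expectation $\tilde{\mathbb{E}}$ satisfying $\mathcal{A}$ respects every degree-$d$ sum-of-squares proof, the same bound transfers to the relaxation, and a simple averaging rounds it to an estimate:

```latex
\[
\mathcal{A} \,\vdash_{d}\, \|\mu(x) - \mu^*\|^2 \le \varepsilon
\;\Longrightarrow\;
\tilde{\mathbb{E}}\!\left[\|\mu(x) - \mu^*\|^2\right] \le \varepsilon
\;\Longrightarrow\;
\bigl\|\tilde{\mathbb{E}}[\mu(x)] - \mu^*\bigr\| \le \sqrt{\varepsilon},
\]
```

where the last step uses $\bigl(\tilde{\mathbb{E}}[q]\bigr)^2 \le \tilde{\mathbb{E}}[q^2]$ coordinate-wise (a pseudo-expectation version of Jensen's inequality, which follows from $\tilde{\mathbb{E}}[(q - \tilde{\mathbb{E}}[q])^2] \ge 0$). The algorithm is thus: solve the SDP, output $\tilde{\mathbb{E}}[\mu(x)]$; all the problem-specific work lives in the low-degree identifiability proof.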

Bio: Pravesh Kothari is a Research Instructor with a joint position at the School of Mathematics, Institute for Advanced Study and Department of Computer Science, Princeton University. In Fall 2019, he will join the faculty of the Computer Science Department at Carnegie Mellon University. His recent research has focused on using the sum-of-squares method to obtain a unified view of the complexity of typical instances of natural algorithmic problems arising in combinatorial optimization, theoretical machine learning, quantum information, algorithmic game theory and cryptography.