Abstract: Polynomial optimization deals with optimizing a polynomial function over a feasible region defined by polynomial inequalities. It thus models a broad range of nonlinear, nonconvex problems arising in many application areas, in particular within operations research.
In the past two decades, dedicated methods have been introduced that make it possible to design hierarchies of tractable semidefinite relaxations, based on sums of squares of polynomials on the one hand and on the dual theory of moments on the other. These hierarchies yield both upper and lower bounds on the global minimum of the original problem. A crucial feature is that, under a mild compactness assumption, these bounds converge asymptotically to the global minimum. A natural question is to understand how the quality of the bounds depends on the level of the relaxation, which is governed by the maximum degree of the sums of squares involved.
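To fix ideas, the lower-bound hierarchy mentioned above can be sketched in its standard (Lasserre) form; the notation below is generic and not taken from the abstract itself:

```latex
% Polynomial optimization over a basic closed semialgebraic set K
f_{\min} \;=\; \min_{x \in K} f(x), \qquad
K \;=\; \{\, x \in \mathbb{R}^n : g_1(x) \ge 0, \dots, g_m(x) \ge 0 \,\}.

% Level-r sum-of-squares relaxation: the degree bound 2r governs
% the size of the corresponding semidefinite program
f_{(r)} \;=\; \sup \Big\{\, \lambda \in \mathbb{R} \;:\;
   f - \lambda \;=\; \sigma_0 + \sum_{i=1}^{m} \sigma_i g_i,\;
   \sigma_0, \sigma_i \text{ sums of squares},\;
   \deg(\sigma_0),\, \deg(\sigma_i g_i) \le 2r \,\Big\}.
```

One has $f_{(r)} \le f_{(r+1)} \le f_{\min}$ for all $r$, and under the compactness (Archimedean) assumption the bounds $f_{(r)}$ converge to $f_{\min}$ as $r \to \infty$; the performance analysis discussed in the lecture quantifies the rate of this convergence in terms of $r$.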
In this lecture we will introduce these hierarchies and discuss their performance analysis. We will focus on the analysis in some simple cases (such as optimization over the hypercube) and present some of the proof techniques, including eigenvalue reformulations and links to extremal roots of orthogonal polynomials (for the upper bounds) and the polynomial kernel method (for the lower bounds).
This is based on joint work with Etienne de Klerk (U. Tilburg) and Lucas Slot (CWI).