Details
Doubly nonnegative (DNN) programming problems are known to be challenging to solve because they involve O(n^2) constraints and O(n^2) variables. In this work, we introduce RiNNAL, a method for solving DNN relaxations of large-scale mixed-binary quadratic programs that exploits the potential low-rank property of their solutions. RiNNAL is a globally convergent Riemannian-based augmented Lagrangian method (ALM) that penalizes the nonnegativity and complementarity constraints while preserving all other constraints as an algebraic variety. After applying a low-rank decomposition to the ALM subproblem, its feasible region becomes an algebraic variety with favorable geometric properties. Our low-rank decomposition model differs from the standard Burer-Monteiro (BM) decomposition model in a crucial step: we equivalently reformulate most of the quadratic constraints arising after the BM decomposition into fewer and more manageable affine constraints. This reformulation also helps to alleviate the violation of Slater's condition for the primal DNN problem. Moreover, we show that the metric projection onto the algebraic variety, although non-convex, can be transformed into a tractable convex optimization problem under certain regularity conditions. Extensive numerical experiments validate the efficiency of the proposed RiNNAL method.
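To illustrate the general idea (not the authors' actual implementation), the sketch below shows how a Burer-Monteiro factorization X = RR^T makes the positive semidefiniteness constraint automatic, while the nonnegativity constraint X >= 0 is handled by a standard augmented Lagrangian penalty term. The function name `alm_objective` and the specific penalty form are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def alm_objective(R, C, Lam, sigma):
    """Illustrative ALM objective for min <C, X> s.t. X PSD, X >= 0,
    with X = R R^T (Burer-Monteiro), multiplier Lam, penalty sigma.
    This is a generic sketch, not the RiNNAL method itself."""
    # X = R R^T is positive semidefinite by construction,
    # so the PSD constraint needs no explicit penalty.
    X = R @ R.T
    # Standard augmented Lagrangian term for the inequality X >= 0:
    # (sigma/2) ||max(0, Lam/sigma - X)||_F^2 - ||Lam||_F^2 / (2 sigma)
    viol = np.maximum(0.0, Lam / sigma - X)
    return (np.sum(C * X)
            + 0.5 * sigma * np.sum(viol ** 2)
            - np.sum(Lam ** 2) / (2.0 * sigma))
```

In an ALM loop one would minimize this function over R (a Riemannian optimization subproblem once the remaining affine constraints are imposed as a variety), then update Lam and sigma; RiNNAL's key contributions concern how that subproblem's feasible set is reformulated and projected onto.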
[Joint work with Di Hou and Tianyun Tang]