Many optimization problems involve minimizing (maximizing) a convex (concave) function over a feasible region given by the intersection of a large number of constraints, either exponentially many or polynomially many of high degree. For these problems, it is known that if we can separate over the feasible region efficiently, then there exists an efficient algorithm to solve the optimization problem. In practice, however, no general-purpose algorithm is currently available.
We present a new, general-purpose algorithm for solving large constrained convex optimization problems. Our method combines Bregman projections with cutting planes and has a novel feature: we forget all non-active constraints. We show that, with two different types of separation oracles, our algorithm is theoretically guaranteed to converge to the optimal solution, and that asymptotically the error decays exponentially. We test our method on two machine learning problems with metric constraints, metric nearness and correlation clustering, and also show that our algorithm can efficiently train L2 SVMs. In many cases, our algorithm outperforms the state-of-the-art algorithms for these specific problem settings, solving instances that could not be solved before.
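The projection-with-forgetting loop described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes the Bregman divergence is squared Euclidean distance (so each Bregman projection onto a halfspace is an ordinary Euclidean projection), and the separation oracle `oracle` and the tolerance-based activity test are hypothetical simplifications introduced here for concreteness.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Euclidean projection of x onto the halfspace {y : a.y <= b}.
    This is the Bregman projection when the divergence is squared
    Euclidean distance (an assumption of this sketch)."""
    viol = a @ x - b
    if viol <= 0:
        return x  # already feasible for this constraint
    return x - (viol / (a @ a)) * a

def project_and_forget(x0, oracle, n_iters=100, tol=1e-8):
    """Hypothetical sketch of a project-and-forget loop.

    `oracle(x)` is a separation oracle returning a list of violated
    constraints (a, b), each meaning a.x <= b. In each round we project
    onto every pooled constraint, keep only the constraints that are
    active (tight) at the current iterate, and forget the rest."""
    x = np.asarray(x0, dtype=float)
    active = []  # constraints retained from previous rounds
    for _ in range(n_iters):
        new = oracle(x)  # freshly separated violated constraints
        if not new and not active:
            break  # feasible and nothing left to enforce
        pool, active = active + new, []
        for a, b in pool:
            x = project_halfspace(x, a, b)
            if abs(a @ x - b) <= tol:
                active.append((a, b))  # tight at the iterate: keep it
            # otherwise the constraint is non-active and is forgotten
    return x
```

For example, with constraints x1 + x2 <= 1, x1 >= 0, x2 >= 0 and a starting point (2, 2), the loop's first projection already lands on the feasible point (0.5, 0.5); only the single tight constraint is carried to the next round, while the satisfied sign constraints are never stored.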