Tractable optimization problems.
We can solve these problems reliably and efficiently.
constituents
- where \(x \in \mathbb{R}^{n}\) is a vector of variables
- \(f_{0}\) is the objective function, the “soft” part to be minimized
- \(f_{1} … f_{m}\) are the inequality constraints
- \(g_{1} … g_{p}\) are the equality constraints
requirements
These problems generally take the form:
\begin{equation} \min f_{0}\qty(x) \end{equation}
subject to:
\begin{align} &f_{i} \qty(x) \leq 0, i = 1 \dots m \\ &Ax = b \end{align}
where the equality constraints are affine (hence the form \(Ax = b\)); the curvature requirement is that
\(f_{0}, … f_{m}\) are convex:
\begin{equation} f_{i}\qty(\theta x + \qty(1-\theta)y) \leq \theta f_{i}\qty(x) + \qty(1-\theta)f_{i}\qty(y) \end{equation}
for \(\theta \in [0,1]\). That is, \(f_{i}\) have non-negative curvature.
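For a concrete check, take \(f\qty(x) = x^{2}\): the difference between the two sides of the inequality is
\begin{equation} \theta x^{2} + \qty(1-\theta)y^{2} - \qty(\theta x + \qty(1-\theta)y)^{2} = \theta\qty(1-\theta)\qty(x-y)^{2} \geq 0 \end{equation}
so the definition holds for every \(\theta \in [0,1]\).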
additional information
sign asymmetry
FUN IMPORTANT FACT: even when a minimization problem is convex, sticking a negative sign on the objective (casting it as a maximization problem) is not necessarily convex. “The negative of things that curve up is things that curve down; this may no longer be convex.”
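For instance, minimizing \(e^{x}\) is a convex problem, but the corresponding maximization
\begin{equation} \max e^{x} \iff \min -e^{x} \end{equation}
has objective \(-e^{x}\), which curves down (is concave), so the resulting problem is no longer convex.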
ease
“Convex (nonnegative curvature) is easy; nonconvex (negative curvature) is hard.”
DSLs for Convex Optimization
DSLs for convex optimization allow us to describe problems at a high level, close to the math; the DSL then transforms the problem into a standard form and hands it to a solver.
CVXPY
CVXPY lets us express convex optimization problems directly as Python objects. For example, to solve:
\begin{equation} \min \norm{Ax - b}^{2}_{2} \end{equation}
subject to:
\(x \geq 0\)
import cvxpy as cp

A, b = ...  # problem data: A is an m-by-n matrix, b an m-vector
n = A.shape[1]

x = cp.Variable(n)                                # decision variable
obj = cp.norm2(A @ x - b)**2                      # objective: ||Ax - b||_2^2
constraints = [x >= 0]                            # elementwise nonnegativity
prob = cp.Problem(cp.Minimize(obj), constraints)
prob.solve()                                      # transforms to standard form, calls a solver
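Once solve() returns, the results live on the problem and variable objects; a minimal sketch using CVXPY's standard accessors:

print(prob.status)  # solver status, e.g. "optimal"
print(prob.value)   # optimal objective value
print(x.value)      # NumPy array containing the optimal x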
