SU-CS254B MAY192025
Partial SAT Algorithms
- run very fast
- always give the right answer
- may time out (i.e., will give up on certain instances)
“Key question: can you make my algorithm fail?”
Our answer…
- uniform random 3-CNF instances
- \(n\): the number of variables
- \(\Delta\), the “clause density”: the ratio of clauses to variables, \(\Delta = \frac{m}{n}\), where \(m\) is the number of clauses
- output \(\phi\): \(\Delta n\) random clauses, each chosen independently and uniformly at random
Is a sample \(\phi \sim R_{3}\qty(n, \Delta)\) likely to be SAT or UNSAT? It depends on your choice of \(\Delta\): if \(\Delta \geq 5.2 > \log_{\frac{7}{8}} \qty(\frac{1}{2}) \approx 5.19\), then \(\lim_{n\to \infty } \text{Pr}_{\phi \sim R_{3}\qty(n, \Delta)}\qty[\phi \text{ is SAT}] = 0\).
Why: a fixed assignment satisfies each random clause with probability \(\frac{7}{8}\), hence all \(\Delta n\) clauses with probability \(\qty(\frac{7}{8})^{\Delta n}\); union-bounding over all \(2^{n}\) assignments gives \(\text{Pr}\qty[\phi \text{ SAT}] \leq 2^{n} \qty(\frac{7}{8})^{\Delta n} \to 0\) once \(\Delta > \log_{\frac{7}{8}} \qty(\frac{1}{2})\).
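To make the sampling model concrete, here is a minimal sketch (my own illustration, not from the lecture; the names `sample_r3` and `is_sat` are hypothetical) of drawing \(\phi \sim R_{3}(n, \Delta)\) and brute-force checking satisfiability for small \(n\):

```python
import itertools
import random

def sample_r3(n, delta, rng=random):
    """Sample a random 3-CNF: n variables, round(delta * n) clauses.
    Each clause picks 3 distinct variables, negating each with probability 1/2."""
    m = round(delta * n)
    clauses = []
    for _ in range(m):
        chosen = rng.sample(range(1, n + 1), 3)
        clauses.append(tuple(v if rng.random() < 0.5 else -v for v in chosen))
    return clauses

def is_sat(n, clauses):
    """Brute-force satisfiability check (only feasible for small n)."""
    for bits in itertools.product([False, True], repeat=n):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False
```

Averaging `is_sat(n, sample_r3(n, delta))` over many samples (for small \(n\)) gives an empirical estimate of \(\text{Pr}\qty[\phi \text{ SAT}]\), which drops sharply as \(\Delta\) grows.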
SU-CS361 APR022024
Formal Formulation of Optimization
If we want to write down an optimization problem…
\begin{align} \min_{x}&\ f(x) \\ s.t.\ & x \in \mathcal{X} \end{align}
where:
- \(x\): “design point” (usually a vector representing the thing you are trying to optimize)
- which is an assignment of “design variables” to values (width, height, etc.)
- \(\mathcal{X}\): the feasible set, i.e. the design points that satisfy the constraints
- \(f: \mathcal{X} \to \mathbb{R}\): the objective function, which maps each design point to the scalar value we want to minimize
- \(x^{*}\): the minimizer, a point (there may be many) that attains the minimum over the feasible set
Conventionally, we minimize; maximizing \(f\) is equivalent to minimizing \(-f\).
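As a concrete illustration of this formulation, here is a minimal sketch in Python using `scipy.optimize.minimize`; the objective and feasible set below are made-up examples, not from the notes:

```python
from scipy.optimize import minimize

# Hypothetical problem: minimize f(x) = (x1 - 1)^2 + (x2 + 2)^2
# subject to the feasible set X = {x : x1 + x2 >= 1}.
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2

# SciPy encodes X via constraint dicts; "ineq" means fun(x) >= 0.
constraints = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 1}]

result = minimize(f, x0=[0.0, 0.0], constraints=constraints)
print(result.x)  # the minimizer x*, approximately [2, -1]
```

Here the unconstrained minimum \((1, -2)\) is infeasible, so the solver lands on the boundary of \(\mathcal{X}\).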
SU-CS361 APR042024
optimization inequalities cannot be strict
Consider:
\begin{align} \min_{x}&\ x \\ s.t.\ & x > 1 \end{align}
this has NO SOLUTION: the infimum is 1, but \(x = 1\) is not in the feasible set, so no feasible point attains it. Hence, we usually specify optimization problems without strict inequalities.
So, instead, we write:
\begin{align} \min_{x}&\ x \\ s.t.\ & x \geq 1 \end{align}
Univariate Conditions
First-Order Necessary Condition
\begin{equation} \nabla f(x^{*}) = 0 \end{equation}
Second-Order Necessary Condition
\begin{equation} \nabla^{2}f(x^{*}) \geq 0 \end{equation}
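Note that these conditions are necessary but not sufficient. A quick example (my addition, not from the notes):

\begin{equation}
f(x) = x^{3}: \qquad f'(0) = 0, \quad f''(0) = 0 \geq 0
\end{equation}

yet \(x^{*} = 0\) is not a local minimum, since \(f(x) < f(0)\) for every \(x < 0\).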
SU-CS361 APR092024
More Bracketing Methods
Quadratic search
if your function is unimodal…
- Pick three points whose values go “high, low, high” (so they bracket a minimum)
- Fit a quadratic through them, evaluate the function at the quadratic’s minimum, and add that point to the set
- Now, drop one of the four resulting points so the remaining three still bracket the minimum (a sketch follows below)
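A minimal sketch of this loop (my own illustration; `quadratic_fit_search` is a hypothetical name, and a valid starting bracket \(a < b < c\) with \(f(b)\) lowest is assumed):

```python
def quadratic_fit_search(f, a, b, c, n_iters=20):
    """Minimize a unimodal f given a bracket a < b < c with f(b) < f(a), f(c)."""
    ya, yb, yc = f(a), f(b), f(c)
    for _ in range(n_iters):
        # Vertex of the parabola through (a, ya), (b, yb), (c, yc).
        num = ya * (b**2 - c**2) + yb * (c**2 - a**2) + yc * (a**2 - b**2)
        den = 2 * (ya * (b - c) + yb * (c - a) + yc * (a - b))
        if den == 0:
            break  # points are collinear; no quadratic minimum
        x = num / den
        yx = f(x)
        # Drop one point so the remaining three still bracket the minimum.
        if x > b:
            if yx > yb:
                c, yc = x, yx                 # keep (a, b, x)
            else:
                a, ya, b, yb = b, yb, x, yx   # keep (b, x, c)
        elif x < b:
            if yx > yb:
                a, ya = x, yx                 # keep (x, b, c)
            else:
                c, yc, b, yb = b, yb, x, yx   # keep (a, x, b)
        else:
            break  # vertex coincides with b; converged
    return b, yb
```

For example, `quadratic_fit_search(lambda x: (x - 2) ** 2, 0.0, 1.0, 4.0)` converges to \(x \approx 2\).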
Shubert-Piyavskii Method
This is a Bracketing approach which guarantees optimality WITHOUT unimodality by using the Lipschitz Constant. However, it only works in one dimension.
Consider a Lipschitz continuous function with Lipschitz Constant \(L\). We take our two initial points \(a\) and \(b\), and first arbitrarily pick a point \(x_{0}\) in the middle to evaluate; this gives us a cone (see Lipschitz Condition) which lower-bounds the function, since \(f(x) \geq f(x_{0}) - L|x - x_{0}|\). Repeatedly evaluating at the lowest point of the resulting sawtooth lower bound tightens it until the best sample is provably close to the global minimum.
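A minimal sketch of the method under these assumptions (my own illustration; `shubert_piyavskii`, the stopping tolerance, and the initial midpoint sample are my choices, not from the notes):

```python
import numpy as np

def shubert_piyavskii(f, a, b, L, tol=1e-3, max_iters=1000):
    """Global 1-D minimization of an L-Lipschitz f on [a, b]
    via the sawtooth lower bound max_i (f(x_i) - L * |x - x_i|)."""
    xs = [a, (a + b) / 2, b]
    ys = [f(x) for x in xs]
    for _ in range(max_iters):
        # Find the lowest vertex of the sawtooth between adjacent samples.
        best_lb, best_x, best_i = np.inf, None, None
        for i in range(len(xs) - 1):
            x = (ys[i] - ys[i + 1]) / (2 * L) + (xs[i] + xs[i + 1]) / 2
            lb = (ys[i] + ys[i + 1]) / 2 - L * (xs[i + 1] - xs[i]) / 2
            if lb < best_lb:
                best_lb, best_x, best_i = lb, x, i
        # Stop once the best sample is provably within tol of the global minimum.
        if min(ys) - best_lb < tol:
            break
        # Evaluate at the minimizer of the lower bound; keeps xs sorted.
        xs.insert(best_i + 1, best_x)
        ys.insert(best_i + 1, f(best_x))
    j = int(np.argmin(ys))
    return xs[j], ys[j]
```

Because the lower bound is valid everywhere (not just locally), the returned value is within `tol` of the true global minimum, with no unimodality assumption.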
