Dual Ascent
Consider the constrained optimization problem
$$\min_{x}\; f(x)\quad \text{subject to}\quad Ax=b,$$
where $A\in \mathbb{R}^{m\times n}$, $b\in \mathbb{R}^{m}$ and $f:\mathbb{R}^{n}\rightarrow \mathbb{R}$ is a convex function.
Its Lagrangian is
$$L(x,y)=f(x)+y^{T}(Ax-b),$$
where $y\in\mathbb{R}^{m}$ is the Lagrange multiplier (dual variable).
Its dual function is
$$g(y)=\inf_{x}L(x,y)=-f^{\ast}(-A^{T}y)-b^{T}y,$$
where $f^{\ast}$ is the convex conjugate of $f$.
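As a quick illustration (the choice of $f$ here is an example, not from the text): for $f(x)=\tfrac{1}{2}\|x\|_{2}^{2}$, the conjugate is $f^{\ast}(u)=\tfrac{1}{2}\|u\|_{2}^{2}$, so the dual function evaluates in closed form to
$$g(y)=-\tfrac{1}{2}\|A^{T}y\|_{2}^{2}-b^{T}y.$$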
The dual problem is then to maximize the dual function,
$$\max_{y}\; g(y).$$
If strong duality holds, the optimal values of the primal and dual problems are equal.
A primal optimal point $x^{\ast}$ can then be recovered from a dual optimal point $y^{\ast}$ as
$$x^{\ast}=\arg\min_{x}L(x,y^{\ast}).$$
Assuming that $g$ is differentiable, we can maximize it with gradient ascent. The gradient $\nabla g(y)$ is exactly the residual of the equality constraint:
$$\nabla g(y)=A\tilde{x}-b,\quad\text{where}\quad \tilde{x}=\arg\min_{x}L(x,y).$$
The dual ascent method then consists of two iterative steps:
$1.$ Minimization step
$$x^{k+1}=\arg\min_{x}L(x,y^{k}),$$
where $k$ is the iteration counter.
$2.$ Dual variable update step
$$y^{k+1}=y^{k}+\alpha^{k}(Ax^{k+1}-b),$$
where $\alpha^{k}>0$ is a step size parameter.
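The two steps above can be sketched in code for a concrete instance: minimizing $\tfrac{1}{2}\|x\|_{2}^{2}$ subject to $Ax=b$, where the minimization step has the closed form $x^{k+1}=-A^{T}y^{k}$. The particular matrix $A$, vector $b$, and fixed step size below are illustrative choices, not from the text.

```python
import numpy as np

# Illustrative problem: minimize (1/2)||x||^2  subject to  Ax = b.
# For this f, argmin_x L(x, y) = -A^T y in closed form.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 2.0])

y = np.zeros(2)   # dual variable
alpha = 0.3       # fixed step size (small enough for convergence here)

for k in range(200):
    x = -A.T @ y                  # 1. minimization step: x^{k+1} = argmin_x L(x, y^k)
    y = y + alpha * (A @ x - b)   # 2. dual update: gradient ascent on g

# The least-norm solution A^T (A A^T)^{-1} b serves as a reference.
x_star = A.T @ np.linalg.solve(A @ A.T, b)
print(np.linalg.norm(A @ x - b))   # primal residual, near zero
print(np.linalg.norm(x - x_star))  # distance to the analytic solution, near zero
```

With a fixed $\alpha$, convergence requires $\alpha < 2/\lambda_{\max}(AA^{T})$; here $\lambda_{\max}(AA^{T})=3$, so $\alpha=0.3$ is well inside that range.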