Mirror Duality in Convex Optimization


While first-order optimization methods are usually designed to efficiently reduce the function value f(x), there has been recent interest in methods that efficiently reduce the magnitude of the gradient ∇f(x), and prior findings show that the two types of methods exhibit a certain symmetry. In this work, we present mirror duality, a one-to-one correspondence between mirror-descent-type methods that reduce function value and those that reduce gradient magnitude. Using mirror duality, we obtain the dual accelerated mirror descent (dual-AMD) method, which efficiently reduces ψ∗(∇f(x)), where ψ is a distance-generating function and ψ∗ quantifies the magnitude of ∇f(x). We then apply dual-AMD to efficiently reduce ∥∇f(⋅)∥q for q∈[2,∞) and to efficiently compute ε-approximate solutions of the optimal transport problem.
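To illustrate the mirror-descent-type updates the abstract refers to, here is a minimal, hedged sketch of plain mirror descent with the negative-entropy distance-generating function ψ on the probability simplex. The paper's dual-AMD updates are not given here and are not reproduced; this sketch only shows the standard roles of ψ and its conjugate ψ∗ as a mirror map, with a toy objective chosen for illustration.

```python
import numpy as np

def mirror_descent_entropy(grad_f, x0, step, n_iters):
    """Mirror descent on the simplex with psi = negative entropy.

    In the dual (mirror) space the update is a gradient step; mapping
    back through the gradient of the conjugate psi* amounts to the
    multiplicative-weights / normalization step below.
    """
    x = x0.copy()
    for _ in range(n_iters):
        y = x * np.exp(-step * grad_f(x))  # dual-space gradient step
        x = y / y.sum()                    # map back via grad psi*
    return x

# Toy objective (illustrative only): f(x) = 0.5 * ||x - c||^2,
# whose minimizer c lies on the simplex.
c = np.array([0.5, 0.3, 0.2])
f = lambda x: 0.5 * np.sum((x - c) ** 2)
grad_f = lambda x: x - c

x0 = np.ones(3) / 3
x_final = mirror_descent_entropy(grad_f, x0, step=0.5, n_iters=200)
```

Here the iterate stays on the simplex by construction, and both the function value f(x) and the gradient magnitude shrink along the run, mirroring the two quantities the paper's duality relates.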

arXiv preprint
Jaeyeon Kim
Ph.D. Student

My research interests span the theoretical foundations of Machine Learning.