Blog
I like playing with interesting ideas and doing some analysis of toy problems (like quadratics).
-
Non-asymptotic local convergence analysis of alternating minimization.
Wenzhi Gao. 04/12/2026.
HTML
|
PDF
We provide a proof template for obtaining non-asymptotic local linear convergence rates of two-block alternating minimization (AM). Unlike traditional Jacobian-based asymptotic arguments, we directly show convergence of the function value gap, achieving equally tight but non-asymptotic guarantees.
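The post's object of study can be seen on a toy problem: two-block alternating minimization on a random strongly convex quadratic, where the function value gap decays at a linear rate. A minimal sketch (the matrix, block split, and iteration count are assumptions for illustration, not the post's setup):

```python
import numpy as np

# Toy quadratic f(x, y) = 0.5 * z^T Q z with z = [x; y], split into two blocks.
# Q is an arbitrary positive definite matrix chosen for illustration.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
Q = M @ M.T + 0.5 * np.eye(4)          # positive definite
A, B, C = Q[:2, :2], Q[:2, 2:], Q[2:, 2:]

def f(x, y):
    z = np.concatenate([x, y])
    return 0.5 * z @ Q @ z

x, y = np.ones(2), np.ones(2)
gaps = []
for _ in range(30):
    # Exact minimization over each block in turn:
    x = -np.linalg.solve(A, B @ y)     # argmin_x f(x, y):  A x + B y = 0
    y = -np.linalg.solve(C, B.T @ x)   # argmin_y f(x, y):  C y + B^T x = 0
    gaps.append(f(x, y))

# The minimum value is f* = 0, and the gap f(x_k, y_k) - f*
# decreases monotonically, at a linear (geometric) rate.
```

Each sweep never increases the objective, so the recorded gaps are monotone; the post's contribution is turning this picture into a non-asymptotic rate.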
-
Online Learning to Precondition I. Space dilation methods for linear systems.
Wenzhi Gao. 03/16/2026.
HTML
|
PDF
This is the beginning of a series of posts on Online Learning to Precondition (OL2P): learning to improve the optimization landscape through the behavior of algorithms. In this first post, we derive a new algorithm for solving linear systems that achieves superlinear convergence, using a space dilation principle motivated by online learning.
-
Hypergradient acceleration.
Wenzhi Gao (with Yifa Yu). 03/06/2026.
HTML
|
PDF
|
Code
This post studies hypergradient descent (HDM), a stepsize adaptation heuristic for gradient-based methods. We show that when the objective function is flat around the optimum, HDM can automatically achieve an accelerated convergence rate.
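The hypergradient update itself fits in a few lines: differentiate f(x_k) with respect to the previous stepsize and take a descent step on the stepsize. A minimal sketch on a toy quadratic (the matrix Q and the constants below are assumptions for illustration, not the post's setting):

```python
import numpy as np

# f(x) = 0.5 * x^T Q x on a small diagonal quadratic (assumed for illustration).
Q = np.diag([1.0, 2.0])

def grad(x):
    return Q @ x

x = np.array([1.0, 1.0])
alpha, beta = 0.05, 1e-3        # initial stepsize and hyper-stepsize (assumed)
g_prev = grad(x)
x = x - alpha * g_prev
for _ in range(100):
    g = grad(x)
    # Since x_k = x_{k-1} - alpha * g_{k-1}, we have
    # d f(x_k) / d alpha = -<g_k, g_{k-1}>; descend on the stepsize itself:
    alpha = alpha + beta * (g @ g_prev)
    x = x - alpha * g
    g_prev = g

f_val = 0.5 * x @ Q @ x   # the stepsize adapts upward while f decays toward 0
```

Here the gradients stay aligned, so the stepsize grows until it balances the curvature; the post analyzes when this heuristic delivers an accelerated rate.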
-
Negative stepsizes also make multi-block ADMM converge.
Wenzhi Gao. 03/02/2026.
HTML
|
PDF
|
Code (divergence)
|
Code (negative stepsize)
Inspired by a recent result that negative stepsizes make gradient descent-ascent converge on bilinear counterexamples, this post shows that periodically using a negative dual stepsize also fixes the divergence of multi-block ADMM on classical quadratic counterexamples.
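The divergence being fixed can be reproduced in a few lines. Below is a sketch of vanilla Gauss-Seidel 3-block ADMM on the classical counterexample of Chen-He-Ye-Yuan (minimize 0 subject to A1*x1 + A2*x2 + A3*x3 = 0); rho = 1 and the iteration count are assumptions, and this is not the post's code, it only exhibits the divergence that the negative dual stepsize repairs:

```python
import numpy as np

# Columns of A are the blocks A1, A2, A3 of the classical counterexample.
A = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [1.0, 2.0, 2.0]])
rho = 1.0                        # penalty parameter (assumed)
x = np.ones(3)                   # scalar blocks x1, x2, x3
lam = np.zeros(3)                # dual variable

norms = []
for _ in range(2000):
    for i in range(3):           # Gauss-Seidel pass over the three blocks
        a = A[:, i]
        r = A @ x - a * x[i]     # residual contributed by the other blocks
        # argmin_{x_i} (rho/2) * || a x_i + r + lam / rho ||^2 :
        x[i] = -a @ (r + lam / rho) / (a @ a)
    lam = lam + rho * (A @ x)    # standard (positive) dual stepsize
    norms.append(np.linalg.norm(np.concatenate([x, lam])))

# The unique solution is x = 0, yet the iterates blow up at a linear rate.
```

With the standard positive dual stepsize the iterate norm grows geometrically; the post's fix is to flip the sign of the dual stepsize periodically.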