Software

Gradient Methods with Online Scaling
https://github.com/udellgroup/osgm-best-hypergrad

This repository implements the algorithms introduced in the Gradient Methods with Online Scaling series of papers. The framework learns online scaling factors for first-order methods, providing provable acceleration over classical gradient descent. It includes both the theoretical versions (OSGM-R, OSGM-H) and practical variants such as OSGM-Best, which often matches the performance of quasi-Newton methods like L-BFGS. The implementation supports convex and nonconvex objectives and integrates cleanly with standard Python optimization workflows.

HDSDP: Software for Semidefinite Programming
https://github.com/Gwzwpxz/hdsdp

HDSDP (Homogeneous Dual-Scaling Semidefinite Programming) is an open-source solver for large-scale semidefinite programs (SDPs). It implements a dual-scaling interior-point algorithm enhanced by a simplified homogeneous self-dual embedding. HDSDP achieves high numerical stability and efficiency on low-rank and sparse SDP instances, with performance competitive with commercial solvers. The solver is part of the Cardinal Optimizer (COPT) and is distributed under the MIT license. For algorithmic details, see "Algorithm 1055: HDSDP – Software for Semidefinite Programming," ACM Transactions on Mathematical Software (2025).
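
To give a flavor of the online-scaling idea behind the first repository above, here is a minimal sketch, not the package's actual API or the OSGM-Best algorithm itself: gradient descent with a per-coordinate stepsize vector that is itself updated online by a hypergradient step. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def online_scaled_gd(grad, x0, p0, hyper_lr=1e-3, iters=1000):
    """Illustrative sketch (not the repository's API): minimize f via
    x_{k+1} = x_k - p * grad f(x_k), learning the scaling p online."""
    x, p = x0.copy(), p0.copy()
    for _ in range(iters):
        g = grad(x)
        x_new = x - p * g  # scaled gradient step
        # Hypergradient of f(x_new) w.r.t. p is -grad f(x_new) * g,
        # so a descent step on p adds hyper_lr * grad(x_new) * g.
        p = p + hyper_lr * grad(x_new) * g
        x = x_new
    return x

# Ill-conditioned quadratic f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)
x = online_scaled_gd(grad, np.zeros(2), np.full(2, 1e-2))
```

On this quadratic, the learned per-coordinate scaling adapts to the very different curvatures of the two coordinates, which a single fixed stepsize cannot do.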