Title: A fast drift method for convex programs


This paper considers convex programs with a general (possibly non-differentiable) convex objective function and Lipschitz continuous convex inequality constraint functions. A simple algorithm is developed that achieves an $O(1/t)$ convergence rate. Like the classical dual subgradient algorithm and the ADMM algorithm, the new algorithm admits a parallel implementation when the objective and constraint functions are decomposable. However, it converges at the faster $O(1/t)$ rate, compared with the best known $O(1/\sqrt{t})$ rate for the dual subgradient algorithm with averaged primals, and it can solve convex programs with nonlinear constraints, which the ADMM algorithm cannot handle. Applied to a multipath network utility maximization problem, the new algorithm yields a decentralized flow control algorithm with a fast $O(1/t)$ convergence rate.
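For context, the $O(1/\sqrt{t})$ baseline mentioned above is the classical dual subgradient method with averaged primals. The sketch below is illustrative only (it is not the paper's algorithm): it applies that baseline to a toy convex program, minimizing $x_1^2 + x_2^2$ subject to $1 - x_1 - x_2 \le 0$, whose optimum is $x^* = (0.5, 0.5)$. The problem instance and step size are assumptions chosen for the demo.

```python
# Illustrative dual subgradient method with averaged primals (the baseline
# the abstract compares against), on a toy problem:
#   minimize  x1^2 + x2^2   subject to  1 - x1 - x2 <= 0,
# with optimum x* = (0.5, 0.5). Step size alpha is a demo choice.

def dual_subgradient(steps=2000, alpha=0.05):
    lam = 0.0            # dual multiplier for the inequality constraint
    avg = [0.0, 0.0]     # running average of the primal iterates
    for t in range(1, steps + 1):
        # Primal step: minimize the Lagrangian
        # x1^2 + x2^2 + lam * (1 - x1 - x2), which has the closed form
        # x_i = lam / 2 for this toy problem.
        x = [lam / 2.0, lam / 2.0]
        # Dual step: subgradient ascent on the dual, projected onto lam >= 0.
        # The constraint value g(x) is a subgradient of the dual function.
        g = 1.0 - x[0] - x[1]
        lam = max(0.0, lam + alpha * g)
        # Average the primal iterates; it is this averaged sequence that
        # converges at the O(1/sqrt(t)) rate cited in the abstract.
        avg = [a + (xi - a) / t for a, xi in zip(avg, x)]
    return avg

print(dual_subgradient())  # approaches [0.5, 0.5]
```

The paper's contribution, per the abstract, is an algorithm that retains this kind of decomposable, parallel structure while improving the rate from $O(1/\sqrt{t})$ to $O(1/t)$.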

Bio: Hao Yu is currently a PhD student in the EE department at the University of Southern California, advised by Prof. Michael J. Neely. He received the B.Eng. degree in Electrical Engineering from Xi’an Jiaotong University, China, and the M.Phil. degree in Electrical Engineering from the Hong Kong University of Science and Technology, Hong Kong. His research interests are in the design and analysis of optimization algorithms, network optimization, and network coding.

a_fast_drift_method_for_convex_programs.txt · Last modified: 2016/09/01 19:15 (external edit)