Coordinate Update Algorithms
Topic: Coordinate Update Algorithms
Speaker: Prof. Wotao Yin (UCLA)
Time: 2015-12-16, 19:00-20:00
Venue: Beijing International Center for Mathematical Research, Quan 9 Classroom
ADMM has been surprising us with numerical success on many nonconvex optimization problems, which include, but are not limited to, the minimization of the Lq quasi-norm and the Schatten-q quasi-norm (0 < q < 1). This talk presents sufficient conditions for ADMM to converge when applied to minimizing a nonconvex, possibly nonsmooth objective over the primal variables x_1,…,x_p and y, subject to coupled linear equality constraints, where p ≥ 0 is an integer. Our ADMM sequentially updates the primal variables in the order x_1,…,x_p,y, followed by updating the dual variable. We separate the variable y from the x_i's since y plays a special role in our analysis. Our results provide sufficient conditions for this nonconvex ADMM to converge with two, three, or more blocks, as these are special cases of our model. By applying our analysis, we show, for the first time, that several ADMM algorithms applied to solve nonconvex models in statistical learning, optimization on manifolds, and matrix decomposition are guaranteed to converge. ADMM has been regarded as a variant of the augmented Lagrangian method (ALM). However, we present a simple nonconvex example to illustrate how ADMM converges while ALM (with a fixed penalty parameter) diverges.
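The sequential update order described in the abstract (primal blocks in turn, then the dual variable) can be sketched on a toy instance. The problem below is my own illustrative example, not one from the talk, and it is convex so that each block update has a closed form: minimize ½(x1² + x2² + y²) subject to x1 + x2 + y = b, with p = 2 blocks x1, x2 plus the separate block y.

```python
# Illustrative multi-block ADMM sketch (hypothetical toy problem, not from the talk):
# minimize 0.5*(x1^2 + x2^2 + y^2)  subject to  x1 + x2 + y = b.
# The augmented Lagrangian is
#   L = 0.5*(x1^2 + x2^2 + y^2) + lam*(x1 + x2 + y - b) + (rho/2)*(x1 + x2 + y - b)^2,
# and each block update minimizes L over one variable with the others held fixed.

def admm_three_block(b, rho=1.0, iters=100):
    x1 = x2 = y = lam = 0.0
    for _ in range(iters):
        # Sequential primal updates in the order x1, x2, y (Gauss-Seidel style):
        # setting dL/dx1 = 0 gives x1*(1 + rho) = -lam - rho*(x2 + y - b), etc.
        x1 = (-lam - rho * (x2 + y - b)) / (1.0 + rho)
        x2 = (-lam - rho * (x1 + y - b)) / (1.0 + rho)
        y = (-lam - rho * (x1 + x2 - b)) / (1.0 + rho)
        # Dual ascent step on the constraint residual.
        lam += rho * (x1 + x2 + y - b)
    return x1, x2, y, lam

if __name__ == "__main__":
    x1, x2, y, lam = admm_three_block(b=3.0)
    # By symmetry the minimizer is x1 = x2 = y = b/3 = 1, with multiplier lam = -1.
    print(x1, x2, y, lam)
```

On this symmetric, strongly convex instance the iterates converge to x1 = x2 = y = b/3; the point of the talk is that such convergence can also be guaranteed, under suitable conditions, for nonconvex objectives, where it is far less obvious.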