A Scalable Projective Scaling Algorithm for Lp Loss with Convex Penalizations
Document Type
Article
Publication Date
2-1-2015
Abstract
This paper presents an accurate, efficient, and scalable algorithm for minimizing a special family of convex functions that have an Lp loss function as an additive component. For this problem, well-known learning algorithms often have well-established results on accuracy and efficiency, but explicit linear scalability with respect to the problem size is rarely reported. The proposed approach starts by developing a second-order, iterative-descent learning procedure for general convex penalization functions, and then builds efficient algorithms for a restricted family of functions that satisfy Karmarkar's projective scaling condition. Under this condition, a lightweight, scalable message passing algorithm (MPA) is further developed by constructing a series of simpler equivalent problems. The proposed MPA is intrinsically scalable because it involves only matrix-vector multiplications and avoids matrix inversion. The MPA is proven to be globally convergent for convex formulations; for nonconvex formulations, it converges to a stationary point. The accuracy, efficiency, scalability, and applicability of the proposed method are verified through extensive experiments on sparse signal recovery, face image classification, and over-complete dictionary learning problems.
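The scalability argument in the abstract, that an iteration built solely from matrix-vector products avoids the cubic cost of matrix inversion, can be illustrated with a generic sketch. The following is not the paper's MPA; it is a standard proximal-gradient (ISTA) update for the special case of an L2 loss with an L1 penalty, with the names ista, lam, step, and iters chosen here for illustration only.

    import numpy as np

    def ista(A, b, lam, step=None, iters=200):
        # Proximal-gradient sketch for: minimize 0.5*||Ax - b||^2 + lam*||x||_1.
        # Each iteration costs only matrix-vector products; no matrix is inverted.
        if step is None:
            # 1 / Lipschitz constant of the gradient of the smooth term
            step = 1.0 / (np.linalg.norm(A, 2) ** 2)
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            grad = A.T @ (A @ x - b)   # matrix-vector multiplications only
            z = x - step * grad        # gradient step on the loss
            # Soft-thresholding: the proximal operator of lam*||.||_1
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
        return x

For instance, with A of size 200 x 500 and b = A @ x0 for a 10-sparse x0, ista(A, b, lam=0.1) recovers an approximation of x0 while every step remains linear in the number of nonzero entries of A, which is the kind of per-iteration cost profile the abstract attributes to the MPA.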
DOI
10.1109/TNNLS.2014.2314129
Montclair State University Digital Commons Citation
Zhou, Hongbo and Cheng, Qiang, "A Scalable Projective Scaling Algorithm for Lp Loss with Convex Penalizations" (2015). Department of Computer Science Faculty Scholarship and Creative Works. 64.
https://digitalcommons.montclair.edu/compusci-facpubs/64