Efficient Regularized Least Squares Classification
Document Type
Conference Proceeding
Publication Date
1-1-2004
Abstract
Kernel-based regularized least squares (RLS) algorithms are a promising technique for classification. RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. In contrast, support vector machines (SVMs) implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. While both have a sound mathematical foundation, RLS is strikingly simple. On the other hand, SVMs in general have a sparse representation of the solution. In this paper, we introduce a very fast version of the RLS algorithm that maintains its achievable level of performance. The proposed new algorithm computes solutions in O(m) time and O(1) space, where m is the number of training points. We demonstrate the efficacy of our very fast RLS algorithm on a number of both real and simulated data sets.
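As background for the formulation the abstract refers to, the standard kernel RLS problem can be sketched as follows (this is the well-known baseline, not the paper's accelerated variant, and the symbols m, lambda, K, and c are the conventional ones rather than the paper's notation):

% Standard kernel RLS background sketch (not the paper's fast algorithm).
% Given training data (x_i, y_i), i = 1..m, a kernel K, and a
% regularization parameter \lambda, RLS minimizes over the RKHS H_K:
\[
  f^{*} = \arg\min_{f \in \mathcal{H}_K}
          \frac{1}{m}\sum_{i=1}^{m} \bigl(y_i - f(x_i)\bigr)^{2}
          + \lambda \lVert f \rVert_{\mathcal{H}_K}^{2}.
\]
% By the representer theorem, f^{*}(x) = \sum_{j=1}^{m} c_j K(x, x_j),
% where the coefficient vector c solves the linear system
\[
  (K + \lambda m I)\, c = y ,
  \qquad K_{ij} = K(x_i, x_j).
\]
% Solving this m-by-m system directly costs O(m^3) time and O(m^2) space,
% which is the baseline cost that a fast RLS method aims to reduce.

The contrast between this cubic baseline and the O(m) time, O(1) space figures quoted in the abstract is what motivates the proposed algorithm.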
DOI
10.1109/CVPR.2004.331
Montclair State University Digital Commons Citation
Zhang, Peng and Peng, Jing, "Efficient Regularized Least Squares Classification" (2004). Department of Computer Science Faculty Scholarship and Creative Works. 249.
https://digitalcommons.montclair.edu/compusci-facpubs/249