Approximate Regularized Least Squares Algorithm for Classification
Document Type
Conference Proceeding
Publication Date
1-1-2018
Abstract
In machine learning, a good predictive model is one that generalizes well to future unseen data. In general, this problem is ill-posed. To mitigate it, a predictive model can be constructed by simultaneously minimizing an empirical error over training samples and controlling the complexity of the model; this gives rise to regularized least squares (RLS). RLS requires matrix inversion, which is expensive, and its "big data" applications can therefore be adversely affected. To address this issue, we have developed an efficient machine learning algorithm for pattern recognition that approximates RLS. The algorithm does not require matrix inversion and achieves competitive performance against the RLS algorithm. It has been shown mathematically that RLS is a sound learning algorithm; therefore, a definitive statement about the relationship between the new algorithm and RLS will lay a solid theoretical foundation for the new algorithm. A recent study shows that the spectral norm of the kernel matrix in RLS is tightly bounded above by the size of the matrix. This spectral norm becomes a constant when the training samples have independent centered sub-Gaussian coordinates; for example, typical sub-Gaussian random vectors such as the standard normal and Bernoulli satisfy this assumption. In essence, each sample is drawn from a product distribution formed from centered univariate sub-Gaussian distributions. These new results allow us to establish a finite-sample bound between the new algorithm and RLS and to show that the new algorithm converges to RLS in the limit. Experimental results validate the theoretical analysis and demonstrate that the new algorithm is very promising for solving "big data" classification problems.
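To make the computational contrast concrete, the sketch below shows exact kernel RLS, which solves a regularized linear system (equivalent to inverting the kernel matrix), next to a simple inversion-free gradient-descent surrogate. This is an illustration only: the abstract does not specify the paper's actual approximation algorithm, so `rls_approx`, its step-size choice, and the iteration count are hypothetical stand-ins for any inversion-free scheme.

```python
import numpy as np

def rls_exact(K, y, lam):
    """Exact kernel RLS: solve (K + lam*I) alpha = y.

    Equivalent to alpha = (K + lam*I)^{-1} y, the expensive
    matrix-inversion step the abstract refers to.
    """
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

def rls_approx(K, y, lam, steps=5000):
    """Hypothetical inversion-free surrogate (not the paper's algorithm).

    Runs gradient descent on the quadratic
    0.5 * alpha^T (K + lam*I) alpha - alpha^T y,
    whose minimizer is the exact RLS solution. The step size is set
    from the spectral norm of K + lam*I, echoing the role the spectral
    norm plays in the abstract's analysis.
    """
    n = K.shape[0]
    A = K + lam * np.eye(n)
    lr = 1.0 / np.linalg.norm(A, 2)  # guarantees convergence for this quadratic
    alpha = np.zeros(n)
    for _ in range(steps):
        alpha -= lr * (A @ alpha - y)  # gradient step; no inversion anywhere
    return alpha

# Toy usage: a small Gram matrix from random features.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
K = X @ X.T
y = rng.normal(size=20)
exact = rls_exact(K, y, 1.0)
approx = rls_approx(K, y, 1.0)
print(np.max(np.abs(exact - approx)))  # small: the iterates approach exact RLS
```

The point of the contrast is cost: the exact solve is cubic in the number of samples, while each surrogate iteration is only a matrix-vector product, which is what makes inversion-free approximations attractive at "big data" scale.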
DOI
10.1117/12.2305075
Montclair State University Digital Commons Citation
Peng, Jing and Aved, Alex J., "Approximate Regularized Least Squares Algorithm for Classification" (2018). Department of Computer Science Faculty Scholarship and Creative Works. 122.
https://digitalcommons.montclair.edu/compusci-facpubs/122
Published Citation
Jing Peng and Alex J. Aved "Approximate regularized least squares algorithm for classification", Proc. SPIE 10649, Pattern Recognition and Tracking XXIX, 106490S (30 April 2018); https://doi.org/10.1117/12.2305075