Least Squares Revisited: Scalable Approaches for Multi-class Prediction

Citation:

A. Agarwal, S. M. Kakade, N. Karampatziakis, L. Song, and G. Valiant, Least Squares Revisited: Scalable Approaches for Multi-class Prediction. ICML, 2014 (also available as an arXiv report).

Abstract:

This work provides simple algorithms for multi-class (and multi-label) prediction in settings where both the number of examples n and the data dimension d are relatively large. These robust, parameter-free algorithms are essentially iterative least-squares updates and are versatile both in theory and in practice. On the theoretical front, we present several variants with convergence guarantees. Owing to their effective use of second-order structure, these algorithms are substantially better than first-order methods in many practical scenarios. On the empirical side, we present a scalable stagewise variant of our approach, which achieves dramatic computational speedups over popular optimization packages such as Liblinear and Vowpal Wabbit on standard datasets (MNIST and CIFAR-10), while attaining state-of-the-art accuracies.
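
To make the flavor of "iterative least-squares updates" concrete, the sketch below shows a minimal fixed-second-order-matrix iteration for multi-class prediction in Python. It is only an illustrative approximation, not the exact algorithm from the paper: it assumes one-hot label encoding and ridge regularization, and the function name and parameters (num_iters, reg) are placeholders introduced here for the example.

import numpy as np

def iterative_least_squares_multiclass(X, y, num_classes, num_iters=20, reg=1e-3):
    # Illustrative sketch (not the paper's exact method): each iteration takes a
    # least-squares-style step on the multinomial-loss residual, reusing a single
    # fixed, covariance-like second-order matrix factored once up front.
    n, d = X.shape
    Y = np.eye(num_classes)[y]          # one-hot labels, shape (n, num_classes)
    W = np.zeros((d, num_classes))      # weight matrix
    H = X.T @ X / n + reg * np.eye(d)   # fixed second-order matrix
    H_inv = np.linalg.inv(H)
    for _ in range(num_iters):
        scores = X @ W
        # Softmax probabilities for the current scores (numerically stabilized).
        P = np.exp(scores - scores.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)
        # Gradient of the multinomial loss; the update is one ridge-type solve.
        grad = X.T @ (P - Y) / n
        W -= H_inv @ grad
    return W

# Usage (hypothetical data): predictions = np.argmax(X_test @ W, axis=1)

Because the second-order matrix is computed and inverted only once, each iteration costs roughly as much as a first-order pass over the data, which is the kind of trade-off that motivates such least-squares-style updates.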

