Sublinear Time Low-Rank Approximation of Positive Semidefinite Matrices
Abstract: I will discuss recent progress on fast algorithms for low-rank approximation of positive semidefinite (PSD) matrices, which are of central importance in machine learning, optimization, and computational science.
In particular, I will show how new techniques in recursive leverage score sampling yield a surprising algorithmic result: we give a method for computing a near-optimal rank-k approximation to any n x n PSD matrix in roughly n * k^2 time. When k is not too large, our algorithm is sublinear -- i.e., it does not need to read all entries of the matrix. It thus bypasses a natural lower bound that holds for general matrices, and it is the first method that exploits PSD structure to obtain provably faster runtimes.
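To give intuition for why PSD structure permits sublinear time, here is a minimal sketch of the classical Nystrom approximation, which reads only the n * s entries in s sampled columns of A. This is not the algorithm from the talk (which samples by recursive ridge leverage scores rather than uniformly, and comes with near-optimality guarantees), just an illustration of the access pattern; the function and variable names are my own.

```python
import numpy as np

def nystrom_approx(A, s, seed=None):
    """Rank-<=s Nystrom approximation of a PSD matrix A.

    Only the n x s block of sampled columns is ever read, so for
    s << n the algorithm inspects a vanishing fraction of A.
    Uniform sampling is used here for simplicity; the talk's
    algorithm instead samples by (recursive) ridge leverage scores.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    S = rng.choice(n, size=s, replace=False)  # sampled column indices
    C = A[:, S]                               # n x s sampled columns
    W = C[S, :]                               # s x s intersection block A[S, S]
    # Nystrom estimate: A ~= C W^+ C^T (W^+ is the pseudoinverse)
    return C @ np.linalg.pinv(W) @ C.T

# Demo: on an exactly rank-k PSD matrix, sampling s >= k columns
# generically recovers A exactly (up to floating-point error).
rng = np.random.default_rng(0)
n, k = 200, 5
G = rng.standard_normal((n, k))
A = G @ G.T                                   # rank-k PSD matrix
A_hat = nystrom_approx(A, s=20, seed=1)
err = np.linalg.norm(A - A_hat) / np.linalg.norm(A)
print(err)
```

The demo reads only 200 x 20 entries of the 200 x 200 matrix, yet the relative error is at floating-point level, since the sampled columns already span the column space of A.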
I will also discuss connections between our work and recent advances on faster nonlinear kernel methods in machine learning. I'll highlight future research directions in this area and computational lower bounds that rule out certain further improvements.
Joint work with David Woodruff. To appear in FOCS 2017. Manuscript available at: https://arxiv.org/abs/1704.03371.