
University of Massachusetts

MLFL Wiki

Intelligent Randomized Algorithms For The Low CP-Rank Tensor Approximation Problem

Abstract: In numerical linear algebra, where it is natural to sacrifice a small amount of accuracy in return for faster computation of solutions whose errors are only slightly larger than optimal, the time-accuracy tradeoff of randomized sketching is well characterized. Algorithms such as Blendenpik and LSRN have shown that carefully designed randomized algorithms can outperform industry-standard linear algebra codes such as those provided in LAPACK.
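As a rough illustration of the sketch-and-solve idea behind such methods (not the actual Blendenpik or LSRN algorithms, which instead use the sketch to build a preconditioner for an iterative solver), the following NumPy snippet compresses a tall least-squares problem with a Gaussian sketching matrix and solves the smaller problem; the dimensions and sketch size are illustrative assumptions, not values from the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    # Tall, skinny overdetermined least-squares problem: min ||Ax - b||_2
    n, d = 20000, 50
    A = rng.standard_normal((n, d))
    x_true = rng.standard_normal(d)
    b = A @ x_true + 0.01 * rng.standard_normal(n)

    # Gaussian sketching matrix; the sketch size m is a small multiple of d
    m = 4 * d
    S = rng.standard_normal((m, n)) / np.sqrt(m)

    # Solve the much smaller sketched problem min ||(SA)x - (Sb)||_2
    x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

    print("relative error vs. exact solution:",
          np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact))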

For numerical tensor algorithms, where problem sizes grow exponentially with the order of the tensor, randomization is even more desirable. In this setting, however, the time-accuracy tradeoff of randomized sketching is more difficult to understand and exploit, because: (1) tensor problems are non-convex, (2) the properties of the data change from iteration to iteration, and (3) straightforward application of standard results on randomized sketching allows the error to increase from iteration to iteration. On the other hand, the iterative nature of such algorithms opens up the opportunity to learn how to sketch more accurately in an online manner.

In this talk we consider the problem of speeding up the computation of low CP-rank (canonical polyadic) approximations of tensors through regularized sketching. We establish for the first time a sublinear convergence rate to approximate critical points of the objective under standard conditions, and further provide algorithms that adaptively select the sketching and regularization rates.
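The abstract does not spell out the algorithmic details, but a toy sketch of the kind of subproblem involved may help: in alternating least squares (ALS) for a third-order CP decomposition, each factor update is a large overdetermined least-squares problem over a Khatri-Rao product, which can be compressed by sampling its rows. The uniform row sampling, sketch size, and ridge term below are illustrative assumptions, not the adaptive sketching and regularization choices proposed in the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    # Small synthetic third-order tensor with known CP rank R
    I, J, K, R = 60, 50, 40, 5
    A_true = rng.standard_normal((I, R))
    B_true = rng.standard_normal((J, R))
    C_true = rng.standard_normal((K, R))
    X = np.einsum('ir,jr,kr->ijk', A_true, B_true, C_true)

    def khatri_rao(U, V):
        # Column-wise Khatri-Rao product of U (J x R) and V (K x R) -> (J*K x R),
        # with row ordering matching a C-order reshape of the tensor.
        r = U.shape[1]
        return (U[:, None, :] * V[None, :, :]).reshape(-1, r)

    # One sketched ALS update for factor A:
    #   exact subproblem:  min_A || X_(1) - A (C o B)^T ||_F
    #   sketched version:  sample s rows of the Khatri-Rao product and the
    #   matching columns of the mode-1 unfolding X_(1)
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    X1 = X.reshape(I, J * K)              # mode-1 unfolding (C-order)
    KR = khatri_rao(B, C)                 # (J*K) x R
    s = 10 * R                            # sketch size (illustrative)
    idx = rng.choice(J * K, size=s, replace=False)
    lam = 1e-3                            # small ridge term for stability

    G = KR[idx].T @ KR[idx] + lam * np.eye(R)
    A_new = np.linalg.solve(G, KR[idx].T @ X1[:, idx].T).T

    # Compare against the unsketched (full) regularized ALS update
    A_full = np.linalg.solve(KR.T @ KR + lam * np.eye(R), KR.T @ X1.T).T
    print("relative deviation from full update:",
          np.linalg.norm(A_new - A_full) / np.linalg.norm(A_full))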

Bio: Alex Gittens is an assistant professor of computer science at Rensselaer Polytechnic Institute. He obtained his PhD in applied mathematics from Caltech in 2013, and BS degrees in mathematics and electrical engineering from the University of Houston. After his PhD, he joined the eBay machine learning research group, then the AMPLab (now the RISELab) at UC Berkeley, before joining RPI. His research interests lie at the intersection of randomized linear algebra and large-scale machine learning, in particular encompassing nonlinear and multilinear low-rank approximations; sketching for nonlinear and multilinear problems; and scalable and data-dependent kernel learning.
