
Efficient Training Of LDA On A GPU By Mean-for-Mode Gibbs Sampling

Abstract:

We introduce Mean-for-Mode Gibbs sampling, a variant of an uncollapsed Gibbs sampler that we use to train LDA on a GPU. The algorithm combines the benefits of both uncollapsed and collapsed Gibbs samplers. Like a collapsed Gibbs sampler, and unlike an uncollapsed one, it has good statistical performance and can use sampling complexity reduction techniques such as sparsity. Meanwhile, like an uncollapsed Gibbs sampler, and unlike a collapsed one, it is embarrassingly parallel and can use approximate counters.
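To make the idea in the abstract concrete, the sketch below shows one sweep of an uncollapsed-style LDA update in which the draws from the Dirichlet conditionals for the document-topic and topic-word distributions are replaced by their posterior means. This is a minimal illustration assembled from the abstract's description, not the authors' implementation; the function name, parameters, and overall structure are assumptions for illustration.

import numpy as np

def mean_for_mode_pass(docs, z, n_topics, n_words, alpha=0.1, beta=0.01, rng=None):
    """One sweep of an uncollapsed-style LDA update in which the Dirichlet
    draws for theta (doc-topic) and phi (topic-word) are replaced by the
    posterior means of their Dirichlet conditionals (hypothetical sketch)."""
    rng = rng or np.random.default_rng()

    # Count current topic assignments.
    n_dk = np.zeros((len(docs), n_topics))   # document-topic counts
    n_kw = np.zeros((n_topics, n_words))     # topic-word counts
    for d, (ws, zs) in enumerate(zip(docs, z)):
        np.add.at(n_dk[d], zs, 1)
        np.add.at(n_kw, (zs, ws), 1)

    # Use the mean of Dirichlet(alpha + n) in place of a sample from it.
    theta = (n_dk + alpha) / (n_dk + alpha).sum(axis=1, keepdims=True)
    phi = (n_kw + beta) / (n_kw + beta).sum(axis=1, keepdims=True)

    # Given theta and phi, every z[d][i] is conditionally independent,
    # so this loop is embarrassingly parallel (e.g. one GPU thread per token).
    for d, ws in enumerate(docs):
        p = theta[d][:, None] * phi[:, ws]   # shape (n_topics, len(ws))
        p /= p.sum(axis=0, keepdims=True)
        z[d] = np.array([rng.choice(n_topics, p=p[:, i]) for i in range(len(ws))])
    return z

For example, with toy data the pass can be iterated to convergence:

rng = np.random.default_rng(0)
docs = [rng.integers(0, 50, size=20) for _ in range(5)]   # 5 toy documents, vocab of 50
z = [rng.integers(0, 10, size=len(ws)) for ws in docs]    # random initial assignments
for _ in range(100):
    z = mean_for_mode_pass(docs, z, n_topics=10, n_words=50, rng=rng)

A real GPU implementation would exploit sparsity in the count matrices and could replace the exact counters with approximate ones, as the abstract notes; the dense NumPy version above only illustrates the sampling structure.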

Bio:

Jean-Baptiste Tristan studied computer science and mathematics at the École Normale Supérieure in Paris. He received a Ph.D. in computer science from Paris Diderot University in 2009. He then worked as a postdoctoral fellow at Harvard University before joining Oracle Labs in 2011.

Jean-Baptiste previously worked in the field of formal software verification via interactive theorem proving. He received the 2011 "La Recherche" award for his work on formally verified algorithms for software equivalence checking. He currently works on the design and implementation of scalable machine learning algorithms for distributed and parallel architectures.
