MAP Inference in Chains Using Column Generation
Abstract: In this talk we use linear programming relaxations, duality, and column generation to derive an algorithm that makes exact MAP inference in linear chains and trees substantially faster. The algorithm is based on a simple bounding strategy, and it can be generalized both to return approximate results (with arbitrary approximation quality) and to act as a 0/1 loss oracle. Finally, its actual complexity at test time depends on how "easy" inference is in the given model, which means it is possible to train a model that will be faster at test time without changing its structure. We encourage further exploration of high-level reasoning about the optimization problems implicit in dynamic programs.
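For context, the dynamic program whose cost the talk targets is exact MAP inference in a linear chain, classically solved by the Viterbi algorithm. The sketch below is a baseline illustration of that standard dynamic program only, not the column-generation method of the talk; the function and array names are ours:

```python
import numpy as np

def viterbi_map(node_scores, edge_scores):
    """Exact MAP inference in a linear chain via dynamic programming (Viterbi).

    node_scores: (T, K) array of per-position label scores.
    edge_scores: (K, K) array of transition scores, shared across positions.
    Returns the highest-scoring label sequence as a list of label indices.
    """
    T, K = node_scores.shape
    best = node_scores[0].copy()           # best score of a prefix ending in each label
    backptr = np.zeros((T, K), dtype=int)  # argmax predecessor for each (position, label)
    for t in range(1, T):
        cand = best[:, None] + edge_scores  # cand[prev, cur]: extend prefix by one edge
        backptr[t] = cand.argmax(axis=0)
        best = cand.max(axis=0) + node_scores[t]
    # Follow back-pointers from the best final label to recover the sequence.
    path = [int(best.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t][path[-1]]))
    return path[::-1]
```

This runs in O(TK^2) time regardless of the input; the point of the talk's bounding strategy is that, on "easy" inputs, much of this work can be avoided while still returning the exact MAP sequence.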
Bios: Alex Passos is a PhD student here at UMass, advised by Andrew McCallum. He has a bachelor's degree from the Federal University of Bahia, Brazil, and did graduate work at the University of Campinas, Brazil, before coming here. He is broadly interested in most of machine learning, and has published papers on nonparametric Bayesian methods, topic models, multi-task learning, recommender systems, and LP relaxations, among other things.
David Belanger is a second-year MS/PhD student advised by Andrew McCallum. Before that, he worked on multilingual optical handwriting recognition at BBN Technologies. He received a B.A. in mathematics from Harvard University, where he worked with Eric Dunham and Jim Rice on methods for numerically simulating earthquake ruptures along rough fault surfaces. His current research focus is machine learning for natural language processing.