
University of Massachusetts

MLFL Wiki

Hierarchical Bayesian Transfer Learning For Probabilistic Action-Effect Rules

Brian Milch

The ways in which an agent's actions affect the world can often be modeled compactly using a set of relational probabilistic rules. I will discuss our work on learning such rule sets for multiple related tasks. We take a hierarchical Bayesian approach, in which the system learns a prior distribution over rule sets. We have developed a class of prior distributions parameterized by a "rule set prototype" that is modified stochastically to produce task-specific rule sets. Our learning algorithm operates by coordinate ascent, iteratively optimizing the task-specific rule sets and the rule set prototype. Experiments show that transferring information from related tasks significantly reduces the amount of training required to predict action effects in blocks-world domains. In addition to these transfer learning results, I will discuss some broader lessons about Bayesian structure learning for relational models. (This is joint work with Ashwin Deshpande, Luke Zettlemoyer, and Leslie Kaelbling.)
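The coordinate-ascent scheme described above can be illustrated with a toy sketch. Nothing here comes from the talk's actual system: the rule representation (rules as opaque labels in a set), the prior (a per-rule penalty for each difference from the prototype, standing in for the stochastic prototype-modification process), and the scoring functions are all simplified placeholders chosen to make the alternating optimization concrete and runnable.

```python
import itertools

PENALTY = 1.0  # hypothetical cost for each rule added to or dropped from the prototype


def log_prior(rule_set, prototype):
    """Toy prior: task rule sets are stochastic modifications of the
    prototype, so each symmetric difference incurs a fixed penalty."""
    return -PENALTY * len(rule_set ^ prototype)


def best_rule_set(candidates, data_score, prototype):
    """Optimize one task's rule set by exhaustive search over subsets of a
    small candidate pool, maximizing data fit plus the prototype prior."""
    best, best_val = frozenset(), float("-inf")
    for r in range(len(candidates) + 1):
        for subset in itertools.combinations(candidates, r):
            s = frozenset(subset)
            val = data_score(s) + log_prior(s, prototype)
            if val > best_val:
                best, best_val = s, val
    return best


def update_prototype(rule_sets):
    """Optimize the prototype given fixed task rule sets: keeping a rule in
    the prototype pays off exactly when a majority of tasks use it, since
    that minimizes the total symmetric-difference penalty."""
    all_rules = set().union(*rule_sets)
    n = len(rule_sets)
    return frozenset(r for r in all_rules
                     if sum(r in s for s in rule_sets) * 2 > n)


def coordinate_ascent(task_candidates, task_scores, iters=10):
    """Alternate between the two optimizations until the prototype is stable."""
    prototype = frozenset()
    rule_sets = []
    for _ in range(iters):
        rule_sets = [best_rule_set(cands, score, prototype)
                     for cands, score in zip(task_candidates, task_scores)]
        new_prototype = update_prototype(rule_sets)
        if new_prototype == prototype:
            break
        prototype = new_prototype
    return prototype, rule_sets
```

In this sketch the transfer effect shows up directly: a rule shared by most tasks migrates into the prototype, after which the prior makes it cheap for every task to keep, while idiosyncratic rules stay task-specific and pay a small penalty.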

Page last modified on October 09, 2007, at 07:23 AM