Hierarchical Bayesian Transfer Learning For Probabilistic Action-Effect Rules

The ways in which an agent's actions affect the world can often be modeled compactly using a set of relational probabilistic rules. I will discuss our work on learning such rule sets for multiple related tasks. We take a hierarchical Bayesian approach, in which the system learns a prior distribution over rule sets. We have developed a class of prior distributions parameterized by a "rule set prototype" that is modified stochastically to produce task-specific rule sets. Our learning algorithm operates by coordinate ascent, iteratively optimizing the task-specific rule sets and the rule set prototype. Experiments show that transferring information from related tasks significantly reduces the amount of training required to predict action effects in blocks-world domains. In addition to these transfer learning results, I will discuss some broader lessons about Bayesian structure learning for relational models. (This is joint work with Ashwin Deshpande, Luke Zettlemoyer, and Leslie Kaelbling.)
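
The coordinate-ascent scheme described above can be illustrated with a small sketch. The code below is a hypothetical toy version under simplifying assumptions, not the authors' implementation: rule sets are reduced to binary inclusion vectors over a shared pool of candidate rules, per-task evidence is simulated by random per-rule scores, and the prior simply penalizes disagreement between a task's rule set and the prototype. All names here (`optimize_task`, `optimize_prototype`, `EPS`, `score`) are illustrative.

```python
# Hypothetical sketch of coordinate ascent over task-specific rule sets and a
# shared "rule set prototype" (toy stand-in for the relational rule learner).

import math
import random

NUM_TASKS, NUM_RULES = 4, 10
EPS = 0.2  # prior probability that a task flips a prototype rule in or out

random.seed(0)

# Simulated per-task evidence: score[t][r] is the gain in data log-likelihood
# from including candidate rule r in task t's rule set.
score = [[random.gauss(0.0, 1.0) for _ in range(NUM_RULES)]
         for _ in range(NUM_TASKS)]

def prior_logp(include: bool, proto: bool) -> float:
    """Log P(task includes rule | prototype includes rule)."""
    return math.log(1 - EPS) if include == proto else math.log(EPS)

def optimize_task(t: int, proto: list) -> list:
    """Best task rule set given the prototype; the objective is additive per rule."""
    return [score[t][r] + prior_logp(True, proto[r]) > prior_logp(False, proto[r])
            for r in range(NUM_RULES)]

def optimize_prototype(task_sets: list) -> list:
    """Best prototype given task rule sets: majority vote maximizes the prior term."""
    return [sum(ts[r] for ts in task_sets) * 2 > NUM_TASKS
            for r in range(NUM_RULES)]

# Coordinate ascent: alternately optimize the task rule sets and the prototype.
prototype = [False] * NUM_RULES
for _ in range(10):
    task_sets = [optimize_task(t, prototype) for t in range(NUM_TASKS)]
    new_prototype = optimize_prototype(task_sets)
    if new_prototype == prototype:
        break
    prototype = new_prototype

print("prototype:", [int(b) for b in prototype])
for t, ts in enumerate(task_sets):
    print(f"task {t}  :", [int(b) for b in ts])
```

In this toy setting each coordinate step is exact, so the alternation converges quickly; the prototype ends up capturing rules that help across most tasks, which is the transfer effect the abstract describes.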