
University of Massachusetts

MLFL Wiki

Using Nonparametric Bayesian Models To Learn Dynamical Systems

A central challenge in designing RL agents capable of learning in highly structured but partially observable domains is representing uncertainty in both the state and the model. Such agents must often cope with rich observations (such as images), a possibly unknown number of objects, those objects' properties and types, and their dynamical interactions. The agent must be able to generalize radically to new situations and to flexibly incorporate prior knowledge in the form of naturally occurring structures, such as trees, rings, and manifolds.

In this talk, I will discuss how hierarchical Bayesian models with structured nonparametric priors can be used to begin to capture these rich dynamical systems. We'll start with the Indian Buffet Process (IBP), and show how it can be used to factor the world into multiple manifolds. I'll then present the Infinite Latent Events Model, which generalizes the IBP to learn a factored, causal model of the world. Time permitting, I will discuss a central challenge in using such models as part of a reinforcement learning agent: planning in partially observable domains where distributions over possible worlds do not have finite-dimensional sufficient statistics.
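As background for the abstract above: the Indian Buffet Process is a prior over binary matrices with a finite number of rows (data points) but an unbounded number of columns (latent features), which is what lets a model posit new features as more data arrive. The sketch below is a minimal generative draw from the standard IBP "culinary" construction; it is illustrative background, not code from the talk, and the function name and use of NumPy are my own choices.

```python
import numpy as np

def sample_ibp(num_customers, alpha, rng=None):
    """Draw a binary feature matrix Z from an Indian Buffet Process prior.

    Rows are 'customers' (data points); columns are 'dishes' (latent
    features). Customer i takes each previously sampled dish k with
    probability m_k / i (m_k = how many earlier customers took it),
    then samples Poisson(alpha / i) brand-new dishes.
    """
    rng = np.random.default_rng(rng)
    dish_counts = []          # m_k for each dish introduced so far
    rows = []
    for i in range(1, num_customers + 1):
        row = []
        for k, m_k in enumerate(dish_counts):
            take = rng.random() < m_k / i   # popularity-proportional choice
            row.append(int(take))
            if take:
                dish_counts[k] += 1
        new_dishes = rng.poisson(alpha / i)  # unbounded feature growth
        row.extend([1] * new_dishes)
        dish_counts.extend([1] * new_dishes)
        rows.append(row)
    # Pad earlier (shorter) rows with zeros to form a rectangular matrix.
    num_features = len(dish_counts)
    Z = np.zeros((num_customers, num_features), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z
```

The expected total number of features grows as alpha times the harmonic number of the dataset size, so the matrix widens slowly but without bound as customers are added.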

This is joint work with Josh Tenenbaum and Noah Goodman.

Page last modified on May 07, 2008, at 08:53 AM