Languages and Machines for Inductive Learning and Uncertain Reasoning
To engineer robust, adaptive, autonomous systems and explain human intelligence in computational terms, we must develop formal languages for the structured representation of uncertain knowledge and machines capable of efficiently solving the resulting inference problems. In this talk, I will present two layers in a stack of computing abstractions designed with these problems in mind.
First, I will present Church, a probabilistic programming language that provides a unified procedural notation for stochastic generative processes and uncertain beliefs. Church recovers the pure subset of Scheme in its deterministic limit, recovers McCarthy's amb in its deductive limit, and generalizes Lisp evaluation to conditional stochastic simulation for universal Bayesian inference. I will show how Church compactly specifies a range of problems, including nonparametric Bayesian models, planning as inference, and Bayesian learning of programs from data, and I will demonstrate an MCMC algorithm for approximately conditioning arbitrary Church programs.
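The core move here, treating conditioning as constrained simulation of a generative program, can be illustrated outside Church itself. The following is a minimal Python sketch (not Church's actual Scheme syntax, and using naive rejection rather than the MCMC algorithm mentioned above): a generative program is run repeatedly, and only executions satisfying the condition are kept.

```python
import random

def flip(p=0.5):
    # elementary random primitive, analogous to Church's flip
    return random.random() < p

def model():
    # generative program: two independent fair coin flips
    return flip(), flip()

def rejection_query(model, condition, query, n=10000):
    # conditional stochastic simulation by rejection sampling:
    # run the program n times, keep only traces satisfying the
    # condition, and average the query over the accepted traces
    accepted = [query(*t) for t in (model() for _ in range(n))
                if condition(*t)]
    return sum(accepted) / len(accepted)

random.seed(0)
# P(first coin heads | at least one coin heads) = 2/3
est = rejection_query(model, lambda a, b: a or b, lambda a, b: a)
```

Rejection sampling works for any program whose condition has nonzero probability, which is what makes this notion of conditioning universal; it is merely exponentially wasteful in general, motivating the MCMC approach.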
Second, I will present combinational stochastic logic circuits, a probabilistically universal digital circuit abstraction that stands in relation to the probability algebra as Shannon's Boolean circuits stand in relation to the Boolean algebra. These circuits expose arbitrary time-space tradeoffs in stochastic computation, and they can be used to cheaply implement massively parallel, fault-tolerant digital machines for Bayesian computation via exact and approximate sampling. I will also present price/performance estimates for factor graph inference problems on the IID, a natively stochastic computer we are developing based on these ideas.
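One way to picture such a circuit element, sketched here under my own assumptions rather than as the talk's actual construction, is a Bernoulli gate: a purely combinational Boolean function whose inputs are k probability bits and k fresh random bits, and whose single output bit is a sample. Because the gate is deterministic given its random inputs, exhaustively enumerating the random bits recovers its output distribution exactly.

```python
from itertools import product

def bernoulli_gate(theta_bits, rand_bits):
    # Combinational stochastic gate: a deterministic Boolean function
    # of the probability bits theta_bits plus fresh random bits.
    # Treating both bit tuples as unsigned integers, it outputs 1
    # iff rand < theta, so P(out = 1) = theta / 2**k under uniform
    # random bits.
    theta = int("".join(map(str, theta_bits)), 2)
    r = int("".join(map(str, rand_bits)), 2)
    return 1 if r < theta else 0

k = 4
theta_bits = (0, 1, 1, 0)   # theta = 6, so P(out = 1) should be 6/16
# enumerate all 2**k settings of the random input lines
ones = sum(bernoulli_gate(theta_bits, rb)
           for rb in product((0, 1), repeat=k))
```

Widening the random input (more bits per sample) trades space for precision, while reusing a narrower gate over several clock cycles trades time instead, a simple instance of the time-space tradeoffs the abstraction exposes.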
Finally, I will touch on how these two layers can be connected into a coherent stack of computing abstractions for uncertain reasoning and inductive learning. This talk contains joint work with Noah Goodman, Daniel Roy, Eric Jonas, and Josh Tenenbaum.