Natively probabilistic computation

dc.contributor Joshua B. Tenenbaum.
dc.contributor Massachusetts Institute of Technology. Dept. of Brain and Cognitive Sciences.
dc.creator Mansinghka, Vikash Kumar
dc.date 2009-10-01T15:59:29Z
dc.date 2009
dc.identifier http://hdl.handle.net/1721.1/47892
dc.identifier 435463030
dc.description Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2009.
dc.description Includes bibliographical references (leaves 129-135).
dc.description I introduce a new set of natively probabilistic computing abstractions, including probabilistic generalizations of Boolean circuits, backtracking search and pure Lisp. I show how these tools let one compactly specify probabilistic generative models, generalize and parallelize widely used sampling algorithms like rejection sampling and Markov chain Monte Carlo, and solve difficult Bayesian inference problems. I first introduce Church, a probabilistic programming language for describing probabilistic generative processes that induce distributions, which generalizes Lisp, a language for describing deterministic procedures that induce functions. I highlight the ways randomness meshes with the reflectiveness of Lisp to support the representation of structured, uncertain knowledge, including nonparametric Bayesian models from the current literature, programs for decision making under uncertainty, and programs that learn very simple programs from data. I then introduce systematic stochastic search, a recursive algorithm for exact and approximate sampling that generalizes a popular form of backtracking search to the broader setting of stochastic simulation and recovers widely used particle filters as a special case. I use it to solve probabilistic reasoning problems from statistical physics, causal reasoning and stereo vision. Finally, I introduce stochastic digital circuits that model the probability algebra just as traditional Boolean circuits model the Boolean algebra. I show how these circuits can be used to build massively parallel, fault-tolerant machines for sampling and allow one to efficiently run Markov chain Monte Carlo methods on models with hundreds of thousands of variables in real time. I emphasize the ways in which these ideas fit together into a coherent software and hardware stack for natively probabilistic computing, organized around distributions and samplers rather than deterministic functions. I argue that by building uncertainty and randomness into the foundations of our programming languages and computing machines, we may arrive at ones that are more powerful, flexible and efficient than deterministic designs, and are in better alignment with the needs of computational science, statistics and artificial intelligence.
dc.description by Vikash Kumar Mansinghka.
dc.description Ph.D.
dc.format 135 leaves
dc.format application/pdf
dc.language eng
dc.publisher Massachusetts Institute of Technology
dc.rights M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission.
dc.rights http://dspace.mit.edu/handle/1721.1/7582
dc.subject Brain and Cognitive Sciences.
dc.title Natively probabilistic computation
dc.type Thesis
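
The abstract above describes generative models written as programs that induce distributions, queried with sampling algorithms such as rejection sampling. The following is a minimal, non-authoritative Python sketch of that idea, not the thesis's Church language; the function names and the toy coin-flip model are hypothetical illustrations only.

import random

def flip(p=0.5):
    """Bernoulli draw: True with probability p."""
    return random.random() < p

def two_coins():
    """Toy generative process: two independent biased coin flips.
    Calling it runs the stochastic process once and returns a sample,
    so the procedure itself induces a distribution over return values."""
    a = flip(0.6)
    b = flip(0.6)
    return {"a": a, "b": b}

def rejection_query(model, predicate, condition, max_tries=100_000):
    """Sample predicate(trace) from the distribution induced by `model`,
    conditioned on `condition(trace)` holding, by simulating repeatedly
    and discarding runs that violate the condition."""
    for _ in range(max_tries):
        trace = model()
        if condition(trace):
            return predicate(trace)
    raise RuntimeError("condition never satisfied")

# Example query: the value of `a`, given that at least one coin came up heads.
samples = [
    rejection_query(two_coins,
                    predicate=lambda t: t["a"],
                    condition=lambda t: t["a"] or t["b"])
    for _ in range(10_000)
]
print("P(a | a or b) is approximately", sum(samples) / len(samples))

In Church the same pattern is written directly in a Lisp dialect, and the thesis's systematic stochastic search and stochastic digital circuits address inference with machinery beyond the naive retry loop sketched here.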


Files in this item

Files Size Format
435463030-MIT.pdf 61.47 MB application/pdf
