Publication · Preprint · 2018

Pyro: Deep Universal Probabilistic Programming

Bingham, Eli; Chen, Jonathan P.; Jankowiak, Martin; Obermeyer, Fritz; Pradhan, Neeraj; Karaletsos, Theofanis; Singh, Rohit; Szerlip, Paul; Horsfall, Paul; Goodman, Noah D.
Open Access · English
Published: 18 Oct 2018
Abstract
Pyro is a probabilistic programming language built on Python as a platform for developing advanced probabilistic models in AI research. To scale to large datasets and high-dimensional models, Pyro uses stochastic variational inference algorithms and probability distributions built on top of PyTorch, a modern GPU-accelerated deep learning framework. To accommodate complex or model-specific algorithmic behavior, Pyro leverages Poutine, a library of composable building blocks for modifying the behavior of probabilistic programs.
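The following is a minimal, hypothetical sketch of how the components described in the abstract fit together: a model and guide written as ordinary Python functions over PyTorch tensors, fit with stochastic variational inference (SVI), and then inspected with a Poutine handler. The toy model, guide, parameter names, and data below are illustrative assumptions, not drawn from the paper.

import torch
import pyro
import pyro.distributions as dist
from pyro.distributions import constraints
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam
from pyro import poutine

data = torch.tensor([0.0, 1.0, 0.5, 1.5])  # hypothetical observations

def model(data):
    # Prior over a latent location, plus a likelihood over the observations.
    loc = pyro.sample("loc", dist.Normal(0.0, 10.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

def guide(data):
    # Variational parameters are registered in Pyro's parameter store.
    q_loc = pyro.param("q_loc", torch.tensor(0.0))
    q_scale = pyro.param("q_scale", torch.tensor(1.0), constraint=constraints.positive)
    pyro.sample("loc", dist.Normal(q_loc, q_scale))

# Stochastic variational inference: maximize the ELBO with a PyTorch optimizer.
svi = SVI(model, guide, Adam({"lr": 1e-2}), loss=Trace_ELBO())
for step in range(1000):
    svi.step(data)

# Poutine handler example: record an execution trace of the model and score it.
trace = poutine.trace(model).get_trace(data)
print(trace.log_prob_sum())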
Subjects
arXiv: Quantitative Biology::Genomics
Free-text keywords: Computer Science - Machine Learning, Computer Science - Programming Languages, Statistics - Machine Learning