PtStat

Probabilistic Programming and Statistical Inference in PyTorch.

Introduction

This project is being developed during my time at Cogent Labs.

The documentation is still a work in progress; a brief API description is given below. The tests might also be helpful.

The API might change quickly during this initial development period.

API

The first dimension is the batch dimension, over which the samples are assumed to be independent.

# Random variable interface:

class RandomVariable:
    def size(self):        # --> (batch_size, rv_dimension)
        ...

    def log_pdf(self, x):  # --> [batch_size]
        ...

    def sample(self):      # --> [batch_size, rv_dimension]
        ...

    def entropy(self):     # --> [batch_size]
        ...


# Implemented random variables:

Normal(size=(batch_size, rv_dimension), cuda=cuda)
Normal(mu, sd)

Categorical(size=(batch_size, rv_dimension), cuda=cuda)
Categorical(p)

Bernoulli(size=(batch_size, rv_dimension), cuda=cuda)
Bernoulli(p)

Uniform(size=(batch_size, rv_dimension), cuda=cuda)
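
For illustration, a minimal usage sketch of the constructors and the interface above, assuming the package is imported as ptstat and that mu and sd are tensors of shape [batch_size, rv_dimension] (the import path and tensor types are assumptions, not confirmed by this README):

# Hypothetical usage sketch; ptstat import path and tensor types are assumptions.
import torch
import ptstat

batch_size, rv_dimension = 16, 4
mu = torch.zeros(batch_size, rv_dimension)  # per-sample means
sd = torch.ones(batch_size, rv_dimension)   # per-sample standard deviations

rv = ptstat.Normal(mu, sd)  # parameterized constructor
x = rv.sample()             # [batch_size, rv_dimension]
lp = rv.log_pdf(x)          # [batch_size]
h = rv.entropy()            # [batch_size]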

# KL-Divergence:

def kld(rv_from, rv_to):  # --> [batch_size]
    ...
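
Similarly, a hedged sketch of computing the KL-divergence between two Normals (same assumptions as in the sketch above):

# Hypothetical KL-divergence sketch; builds on the variables defined above.
p = ptstat.Normal(mu, sd)                                # "from" distribution
q = ptstat.Normal(torch.zeros(batch_size, rv_dimension),
                  torch.ones(batch_size, rv_dimension))  # "to" distribution
d = ptstat.kld(p, q)                                     # [batch_size]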

Changelog

Version 0.2.0

  • removed specialized distributions in favor of more flexible constructors
  • refactoring: split the distributions into multiple files

Version 0.1.0

  • initial commit

Licensing

The code is released under the MIT license.
