Stochastic Normalizing Flows

Overview


We introduce stochasticity into Boltzmann-generating flows. Normalizing flows are exact-probability generative models that can efficiently sample x and compute the generation probability p(x), so that probability-based methods can be used to train the generator. Boltzmann-generating flows combine flows and reweighting in order to learn to generate unbiased samples with respect to some target density exp(-u(x)), which is first approximated by p(x) and then corrected by reweighting. The key methodological advance is that we avoid computing p(x) point-wise, which would require an intractable integration over all paths mapping to the same x, and show that both training of the flow and reweighting of p(x) to exp(-u(x)) can be done via path sampling, without requiring p(x) explicitly.

Stochastic Normalizing Flows mix invertible neural networks and stochastic sampling layers
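To make the path-weight bookkeeping concrete, below is a minimal PyTorch sketch of the idea. This is not the bgtorch API: AffineLayer, LangevinLayer, snf_sample, u, grad_u, and log_q0 are illustrative assumptions. A deterministic invertible layer contributes log|det J| to the log path weight, a stochastic Langevin layer contributes the log-ratio of backward to forward proposal densities, and summing these contributions along a sampled path yields an unnormalized importance weight for reweighting to exp(-u(x)), without evaluating p(x) point-wise.

import torch

class AffineLayer:
    """Deterministic invertible layer x -> exp(log_a) * x + b (illustrative)."""
    def __init__(self, log_a, b):
        self.log_a, self.b = log_a, b

    def forward(self, x):
        x_new = torch.exp(self.log_a) * x + self.b
        # contribution to the log path weight: log|det J|
        return x_new, self.log_a.sum().expand(x.shape[0])

class LangevinLayer:
    """Stochastic layer: one overdamped Langevin step targeting exp(-u_t) (illustrative)."""
    def __init__(self, grad_u, step):
        self.grad_u, self.step = grad_u, step

    def _log_proposal(self, x_from, x_to):
        # Gaussian proposal log-density up to a constant; the constants cancel
        # in the backward/forward ratio because both directions use the same step size
        mean = x_from - self.step * self.grad_u(x_from)
        return -((x_to - mean) ** 2).sum(-1) / (4.0 * self.step)

    def forward(self, x):
        noise = torch.randn_like(x)
        x_new = x - self.step * self.grad_u(x) + (2.0 * self.step) ** 0.5 * noise
        # contribution to the log path weight: log p_b(x_new -> x) - log p_f(x -> x_new)
        return x_new, self._log_proposal(x_new, x) - self._log_proposal(x, x_new)

def snf_sample(layers, u, log_q0, z):
    """Push prior samples z through the layers; return samples and log importance weights."""
    x, logw = z, -log_q0(z)              # the path weight starts at -log q0(z)
    for layer in layers:
        x, dlogw = layer.forward(x)
        logw = logw + dlogw
    return x, logw - u(x)                # unnormalized log importance weight

Minimizing the mean of -logw over sampled paths trains the flow (it bounds the KL divergence between generated and target distributions up to a constant), and unbiased expectations under exp(-u(x)) are obtained by importance-reweighting the generated samples with weights proportional to exp(logw).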

Publication

Please find the arXiv preprint here: https://arxiv.org/abs/2002.06707

Stochastic Normalizing Flows has been accepted at NeurIPS 2020; the citation will be updated once the proceedings version is available.

@article{snf,
  title = {Stochastic Normalizing Flows},
  author = {H. Wu and J. K{\"o}hler and F. No{\'e}},
  journal = {arXiv:2002.06707},
  year = {2020}
}

Installation and running experiments

System requirements: All experiments were run with Python 3.7 and PyTorch 1.5 on macOS. They are expected to work on macOS and Linux systems with these or newer Python and PyTorch versions.

Installation: Install the bgtorch flow package:

cd bgtorch
python setup.py develop
cd ..

Install the snf_code package (specialized code for this paper):

cd snf_code
python setup.py develop
cd ..

Optional: install OpenMM and openmmtools for running experiment 3:

conda install -c omnia openmm 
conda install -c omnia openmmtools 

Run Experiments

  • To run experiments 1-3, open and run the respective notebooks with jupyter (example commands below)
  • To run experiment 4, run the respective Python file directly
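
For example (hypothetical file names; use the actual notebook and script names in the repository):

jupyter notebook experiment1.ipynb   # hypothetical name: experiments 1-3 are notebooks
python experiment4.py                # hypothetical name: experiment 4 is a plain script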
Comments
  • LICENSE for repository

    Thanks for open-sourcing this code. I would be interested in trying it out at DeepMind, but that is difficult unless the license is specified in the repository. If that is something you are able to do, it is most straightforward to put a LICENSE file in the repository. I can try it if it has, for instance, an Apache 2.0 or MIT license, though of course I can't advise you on which license works best in your case. At DeepMind we use Apache 2.0. Happy to discuss further. Thanks very much, Alex

    opened by alexggmatthews 2
  • Reproducing LangevinVAE result in Table 3

    Thanks for your insightful paper and for sharing the code.

    I ran your code on experiment 4 (MNIST) and can reproduce the results claimed in Table 3, except for LangevinVAE, which reported an NLL of around 200 in multiple runs.

    I noticed what could be a bug in your code: the line misses y in calling interpolated_force. But the results didn't change even when I changed it to

    x1 = x + stepsize * self.interpolated_force(x, y, lambda_) + torch.sqrt(2*stepsize) * torch.randn_like(x)
    
    opened by qsh-zh 5
Owner
AI4Science group, FU Berlin (Frank Noé and co-workers)