New features
Parallel sampling
Building on poutine.broadcast, Pyro's SVI and HMC inference algorithms now support parallel sampling. For example, to use parallel sampling in SVI, create an ELBO instance and configure the two particle options, e.g.

from pyro.infer import Trace_ELBO

elbo = Trace_ELBO(num_particles=100,
                  vectorize_particles=True)
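The vectorized ELBO is then passed to SVI as usual. A minimal sketch (the model, guide, and data here are hypothetical placeholders):

from pyro.infer import SVI
from pyro.optim import Adam

# model and guide are assumed to be broadcastable Pyro programs.
svi = SVI(model, guide, Adam({"lr": 0.01}), loss=elbo)
loss = svi.step(data)  # evaluates all 100 particles in one vectorized pass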
Dependent enumeration
TraceEnum_ELBO, HMC, NUTS, and infer_discrete can now exactly marginalize out discrete latent variables. For dependency structures with narrow treewidth, Pyro performs cheap marginalization via message-passing algorithms, enabling classic learning algorithms such as Baum-Welch for HMMs, DBNs, and CRFs. See our enumeration tutorial for details.
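As a minimal sketch (this toy mixture model is illustrative, not from the tutorial), a discrete assignment variable can be marginalized out exactly:

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, TraceEnum_ELBO, config_enumerate
from pyro.optim import Adam

@config_enumerate(default="parallel")
def model(data):
    weights = torch.tensor([0.3, 0.7])
    locs = pyro.param("locs", torch.tensor([-1.0, 1.0]))
    with pyro.plate("data", len(data)):
        # Discrete latent; TraceEnum_ELBO sums it out exactly.
        z = pyro.sample("assignment", dist.Categorical(weights))
        pyro.sample("obs", dist.Normal(locs[z], 1.0), obs=data)

def guide(data):
    pass  # the discrete site is enumerated, so nothing to sample here

svi = SVI(model, guide, Adam({"lr": 0.01}),
          loss=TraceEnum_ELBO(max_plate_nesting=1))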
HMC/NUTS enhancements
- The mass matrix can be learned along with the step size during the warmup phase. This feature significantly improves the performance of HMC/NUTS (see the sketch after this list).
- Multiple parallel chains are supported (on the CPU), together with various chain diagnostics such as R-hat and effective sample size.
- Models with discrete latent variables are enumerated over in parallel.
- In NUTS, there are two ways of choosing a candidate from a trajectory: multinomial sampling and slice sampling. We have implemented multinomial sampling and use it by default.
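Mass matrix adaptation and multiple chains can be combined roughly as follows. This is a sketch: the model is hypothetical and minor API details may differ from your Pyro version.

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import EmpiricalMarginal
from pyro.infer.mcmc import MCMC, NUTS

def model(data):
    loc = pyro.sample("loc", dist.Normal(0.0, 10.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

data = torch.randn(100) + 3.0
kernel = NUTS(model, adapt_step_size=True, adapt_mass_matrix=True)
posterior = MCMC(kernel, num_samples=500, warmup_steps=300,
                 num_chains=4).run(data)
loc_marginal = EmpiricalMarginal(posterior, sites="loc")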
New distributions
MaskedMixture interleaves two distributions element-wise, as a succinct alternative to multiple sample statements inside multiple poutine.mask contexts.
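A minimal sketch (the mask and component shapes are illustrative):

import torch
import pyro.distributions as dist

# Where mask == 0 the Normal is used; where mask == 1, the Gamma.
# (Masks are ByteTensors in Pyro 0.3 / PyTorch 1.0.)
mask = torch.tensor([1, 0, 1], dtype=torch.uint8)
d = dist.MaskedMixture(mask,
                       dist.Normal(torch.zeros(3), 1.0),
                       dist.Gamma(torch.full((3,), 2.0), 2.0))
x = d.sample()        # shape (3,)
logp = d.log_prob(x)  # element-wise, one term per mask entry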
RelaxedBernoulliStraightThrough and RelaxedOneHotCategoricalStraightThrough are discrete distributions that have been relaxed to continuous space and are thus equipped with pathwise gradients. These distributions can be useful in the context of variational inference, where they can provide lower-variance (but biased) gradient estimates. Note that these distributions may be numerically finicky, so please let us know if you run into any problems.
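For example, a sketch assuming the constructor mirrors torch's RelaxedOneHotCategorical(temperature, logits):

import torch
import pyro.distributions as dist

temperature = torch.tensor(0.5)
d = dist.RelaxedOneHotCategoricalStraightThrough(temperature,
                                                 logits=torch.zeros(3))
z = d.rsample()  # reparameterized, so pathwise gradients are available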
VonMises and VonMises3D are likelihood-only distributions that are useful for observing 2D or 3D angle data.
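For example, observed 2D angles might be scored as follows (a sketch with made-up data):

import math
import torch
import pyro
import pyro.distributions as dist

angles = (torch.rand(10) * 2 - 1) * math.pi  # observed angles in [-pi, pi)
pyro.sample("angle_obs",
            dist.VonMises(loc=torch.tensor(0.0),
                          concentration=torch.tensor(1.0)),
            obs=angles)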
AVFMultivariateNormal is a multivariate normal distribution that comes equipped with an adaptive gradient estimator that can lead to reduced gradient variance.
MixtureOfDiagNormals, MixtureOfDiagNormalsSharedCovariance, and GaussianScaleMixture are three families of mixture distributions that come equipped with pathwise gradient estimators (which tend to yield low-variance gradients).
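A sketch with K=3 components in D=2 dimensions (argument names follow our reading of the distribution docs):

import torch
import pyro.distributions as dist

K, D = 3, 2
d = dist.MixtureOfDiagNormals(
    locs=torch.randn(K, D),           # component means
    coord_scale=torch.ones(K, D),     # per-coordinate scales
    component_logits=torch.zeros(K))  # mixture weights (logits)
x = d.rsample()  # pathwise gradients flow through this sample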
PlanarFlow and PermutationFlow are two transforms useful for constructing normalizing flows.
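A minimal flow sketch, assuming the PlanarFlow constructor takes the input dimension:

import torch
import pyro.distributions as dist

base = dist.Normal(torch.zeros(2), torch.ones(2))
flow = dist.PlanarFlow(2)  # assumed signature: PlanarFlow(input_dim)
flow_dist = dist.TransformedDistribution(base, [flow])
x = flow_dist.sample()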
InverseAutoregressiveFlow improvements, such as an explicit inversion operator.
GPyTorch integration
This isn't really a feature of Pyro, but we'd like to point out a new feature of the excellent GPyTorch library: GPyTorch can now use Pyro for variational inference, and GPyTorch models can now be used in Pyro models. We recommend the new TraceMeanField_ELBO loss for GPyTorch models.
Analytic KL in ELBO
TraceMeanField_ELBO can take advantage of analytic KL divergence expressions in ELBO computations, when available. This ELBO implementation places some restrictions on variable dependency structure. It is especially useful for GPyTorch models.
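Usage is a drop-in replacement for Trace_ELBO (the model and guide here are hypothetical):

from pyro.infer import SVI, TraceMeanField_ELBO
from pyro.optim import Adam

svi = SVI(model, guide, Adam({"lr": 0.01}), loss=TraceMeanField_ELBO())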
IWELBO
An implementation of the Importance Weighted ELBO objective (pyro.infer.RenyiELBO) is now included. This implementation also generalizes IWELBO to the alpha-divergence (Rényi divergence of order α) case.
Automatic max_plate_nesting
Pyro's ELBO implementations can now automatically determine max_plate_nesting (formerly known as max_iarange_nesting) in models with static plate nesting structure.
Autoguide
Some new autoguides are implemented: AutoIAFNormal and AutoLaplaceApproximation.
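For example (a sketch; in 0.3 the autoguides live in pyro.contrib.autoguide, and model is a hypothetical Pyro program):

from pyro.contrib.autoguide import (AutoIAFNormal,
                                    AutoLaplaceApproximation)

guide = AutoIAFNormal(model)               # flow-based guide
laplace = AutoLaplaceApproximation(model)  # Laplace approximation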
Support for the PyTorch JIT
The PyTorch JIT compiler currently has only partial support for ops used in Pyro programs. If your model has static structure and you're lucky enough to use ops supported by the JIT, you can sometimes get an order-of-magnitude speedup. To enable the JIT in SVI, simply replace the Trace_ELBO, TraceGraph_ELBO, or TraceEnum_ELBO classes with their JIT wrappers JitTrace_ELBO, JitTraceGraph_ELBO, or JitTraceEnum_ELBO. To enable the JIT in HMC or NUTS, pass the jit_compile kwarg. See our JIT tutorial for details.
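For example (a sketch with a hypothetical model and guide):

from pyro.infer import SVI, JitTrace_ELBO
from pyro.infer.mcmc import NUTS
from pyro.optim import Adam

# SVI: swap the ELBO class for its Jit wrapper.
svi = SVI(model, guide, Adam({"lr": 0.01}), loss=JitTrace_ELBO())

# HMC/NUTS: pass the jit_compile kwarg.
kernel = NUTS(model, jit_compile=True)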
Stats
pyro.ops.stats contains many popular statistics functions such as resample, quantile, pi (percentile interval), hpdi (highest posterior density interval), autocorrelation, autocovariance, etc.
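For example (a sketch; argument names follow our reading of the pyro.ops.stats docs):

import torch
from pyro.ops import stats

samples = torch.randn(1000)
q = stats.quantile(samples, [0.05, 0.5, 0.95])
interval = stats.hpdi(samples, prob=0.9)  # highest posterior density
acf = stats.autocorrelation(samples)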
Better error messages
Pyro now provides more validation checks and better error messages, including shape logging using the Trace.format_shapes() method. This is very useful for debugging shape errors. See the tensor shapes tutorial for help in reading the shapes table.
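A typical debugging pattern (the model and its args here are hypothetical):

import pyro.poutine as poutine

trace = poutine.trace(model).get_trace(data)
trace.compute_log_prob()  # optional: also shows log_prob shapes
print(trace.format_shapes())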
New Tutorials
A number of new tutorials have been added, grouped into Language and Examples sections, along with additional examples in the examples directory.
Breaking changes
pyro.plate replaces all of pyro.irange, pyro.iarange, and poutine.broadcast. You should no longer need to use poutine.broadcast manually.
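A before/after sketch of the migration:

import torch
import pyro
import pyro.distributions as dist

def model(data):
    # Before (0.2): pyro.iarange under a poutine.broadcast handler.
    # After (0.3): pyro.plate broadcasts automatically.
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(0.0, 1.0), obs=data)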
independent() has been renamed to to_event().
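For example:

import torch
import pyro.distributions as dist

# Previously: dist.Normal(loc, scale).independent(1)
d = dist.Normal(torch.zeros(3), torch.ones(3)).to_event(1)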
poutine.mask was separated from poutine.scale. You should now use poutine.mask with ByteTensors and poutine.scale with positive tensors (usually just scalars).
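A sketch of the split usage (the sample statements inside are elided):

import torch
import pyro.poutine as poutine

mask = torch.tensor([1, 0, 1], dtype=torch.uint8)  # ByteTensor
with poutine.mask(mask=mask):
    ...  # masked sample statements go here
with poutine.scale(scale=10.0):
    ...  # scaled sample statements go here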
Distribution.enumerate_support() now takes an expand argument; calling .enumerate_support(expand=False) returns support values without expanding them over the distribution's batch shape.
- Some distribution interfaces changed, e.g. LowRankMultivariateNormal and HalfNormal.
- The Pyro Gaussian Process module has been revised:
  - Variational inference now works with PyTorch parameters directly instead of interacting with Pyro's ParamStoreDict as before.
  - The methods .get_param(name) and .fix_param(name, value) are removed.
  - Autoguides are supported through the method .autoguide(name, ...). We have also implemented a Bayesian GPLVM model to illustrate autoguide functionality.
  - The kernel methods .sum() and .product() are removed. Instead, we encourage users to adopt a better paradigm: Sum(kern0, kern1) and Product(kern0, kern1) (see the sketch after this list).
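A sketch of the new compositional kernel style:

from pyro.contrib.gp.kernels import RBF, Sum, Product

k0 = RBF(input_dim=1)
k1 = RBF(input_dim=1)
k_sum = Sum(k0, k1)       # replaces the removed .sum() method
k_prod = Product(k0, k1)  # replaces the removed .product() method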
Note also that Pyro 0.3 now uses PyTorch 1.0, which itself introduces a number of breaking changes.
Experimental features
We are releasing the following features early, intended for experimental use only. Pyro provides no backward-compatibility guarantees for these features.
Tracking and data association
pyro.contrib.tracking provides some experimental components for data association and multiple-object tracking. See the object tracking and Kalman filter tutorials for examples of using the library.
Optimal experimental design
pyro.contrib.oed provides some support for Bayesian Optimal Experimental Design (OED) in Pyro. In particular, it provides support for estimating the Expected Information Gain, which is one of the key quantities required for Bayesian OED. This package is in active development and is expected to undergo significant changes. See the docs for more details.
Automatic naming and scoping
pyro.contrib.autoname provides some automatic naming utilities that can ease the burden of subscripting like "x_{}".format(t).
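For example, as we understand the scope utility (a sketch; the exact generated names may differ):

import pyro
import pyro.distributions as dist
from pyro.contrib.autoname import scope

@scope(prefix="step")
def step():
    # Repeated calls are disambiguated automatically, e.g.
    # "step/x", "step__1/x", ... instead of manual "x_{}".format(t).
    return pyro.sample("x", dist.Normal(0.0, 1.0))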