pymc-learn: Practical Probabilistic Machine Learning in Python
Contents:
What is pymc-learn?
pymc-learn is a library for practical probabilistic machine learning in Python.
It provides a variety of state-of-the-art probabilistic models for supervised and unsupervised machine learning. It is inspired by scikit-learn and focuses on bringing probabilistic machine learning to non-specialists. It uses a syntax that mimics scikit-learn. Emphasis is put on ease of use, productivity, flexibility, performance, documentation, and an API consistent with scikit-learn. It depends on scikit-learn and PyMC3 and is distributed under the new BSD-3 license, encouraging its use in both academia and industry.
Users can now have calibrated quantities of uncertainty in their models using powerful inference algorithms -- such as MCMC or variational inference -- provided by PyMC3. See :doc:`why` for a more detailed description of why pymc-learn was created.
Note
pymc-learn leverages and extends the Base template provided by the PyMC3 Models project: https://github.com/parsing-science/pymc3_models
Transitioning from PyMC3 to PyMC4
.@pymc_learn has been following closely the development of #PyMC4 with the aim of switching its backend from #PyMC3 to PyMC4 as the latter grows to maturity. Core devs are invited. Here's the tentative roadmap for PyMC4: https://t.co/Kwjkykqzup cc @pymc_devs https://t.co/Ze0tyPsIGH
— pymc-learn (@pymc_learn) November 5, 2018
Familiar user interface
pymc-learn mimics scikit-learn. You don't have to completely rewrite your scikit-learn ML code.
scikit-learn:

from sklearn.linear_model import LinearRegression
lr = LinearRegression()
lr.fit(X, y)

pymc-learn:

from pmlearn.linear_model import LinearRegression
lr = LinearRegression()
lr.fit(X, y)
The difference between the two models is that pymc-learn estimates model parameters using Bayesian inference algorithms such as MCMC or variational inference. This produces calibrated quantities of uncertainty for model parameters and predictions.
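For example, fitting follows the familiar scikit-learn pattern, but the fitted estimator carries a posterior rather than point estimates. The sketch below is illustrative only: the toy dataset is made up, and it assumes the estimator's predict method accepts return_std=True (as the Gaussian process example in the Quick Start does) to return predictive standard deviations alongside means.

import numpy as np
from pmlearn.linear_model import LinearRegression

# Toy data for illustration only
X = np.random.randn(200, 1)
y = 2.5 * X[:, 0] + np.random.randn(200) * 0.3

lr = LinearRegression()
lr.fit(X, y)  # infers a posterior over the intercept, slope, and noise
y_mean, y_std = lr.predict(X, return_std=True)  # predictive mean and uncertainty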
Quick Install
pymc-learn requires a working Python interpreter (2.7 or 3.5+). We recommend installing Python and key numerical libraries using the Anaconda Distribution, which has one-click installers available on all major platforms.
Assuming a standard Python environment (including pip) is installed on your machine, pymc-learn itself can be installed in one line. You can install pymc-learn from PyPI using pip as follows:
pip install pymc-learn
Or from source as follows:
pip install git+https://github.com/pymc-learn/pymc-learn
Caution!
pymc-learn is under heavy development. We recommend installing pymc-learn in a Conda environment because Conda provides Math Kernel Library (MKL) routines that accelerate math functions. If you are having trouble, try using a distribution of Python that includes these packages, such as Anaconda.
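Once installed, you can check that the package imports correctly from a Python prompt. This is a minimal sanity check and assumes the package exposes a __version__ attribute:

import pmlearn

# Print the installed version to confirm the installation succeeded
print(pmlearn.__version__)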
Dependencies
pymc-learn is tested on Python 2.7, 3.5 & 3.6 and depends on Theano, PyMC3, Scikit-learn, NumPy, SciPy, and Matplotlib (see requirements.txt for version information).
Quick Start
# For regression using Bayesian Nonparametrics
>>> from sklearn.datasets import make_friedman2
>>> from pmlearn.gaussian_process import GaussianProcessRegressor
>>> from pmlearn.gaussian_process.kernels import DotProduct, WhiteKernel
>>> X, y = make_friedman2(n_samples=500, noise=0, random_state=0)
>>> kernel = DotProduct() + WhiteKernel()
>>> gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)
>>> gpr.score(X, y)
0.3680...
>>> gpr.predict(X[:2,:], return_std=True)
(array([653.0..., 592.1...]), array([316.6..., 316.6...]))
Scales to Big Data & Complex Models
Recent research has led to the development of variational inference algorithms that are fast and almost as flexible as MCMC. For instance, Automatic Differentiation Variational Inference (ADVI) is illustrated in the code below.
from pmlearn.neural_network import MLPClassifier

model = MLPClassifier()
# X_train and y_train are your training features and labels
model.fit(X_train, y_train, inference_type="advi")
Instead of drawing samples from the posterior, these algorithms fit a distribution (e.g., normal) to the posterior, turning a sampling problem into an optimization problem. ADVI is provided by PyMC3.
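To make the contrast concrete, the sketch below uses PyMC3 directly (not pymc-learn) to estimate the same kind of model both by MCMC sampling and by ADVI; the data and model are hypothetical and purely illustrative.

import numpy as np
import pymc3 as pm

# Hypothetical data for illustration only
X = np.random.randn(500)
y = 1.5 * X + np.random.randn(500) * 0.5

with pm.Model():
    slope = pm.Normal("slope", mu=0, sd=10)
    sigma = pm.HalfNormal("sigma", sd=1)
    pm.Normal("obs", mu=slope * X, sd=sigma, observed=y)

    trace = pm.sample(1000)                  # MCMC: draw samples from the posterior
    approx = pm.fit(n=10000, method="advi")  # ADVI: optimize a normal approximation
    advi_trace = approx.sample(1000)         # then sample cheaply from the approximation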
Citing pymc-learn
To cite pymc-learn in publications, please use the following:
Emaasit, Daniel (2018). Pymc-learn: Practical probabilistic machine learning in Python. arXiv preprint arXiv:1811.00542.
Or using BibTeX as follows:
@article{emaasit2018pymc,
title={Pymc-learn: Practical probabilistic machine learning in {P}ython},
author={Emaasit, Daniel and others},
journal={arXiv preprint arXiv:1811.00542},
year={2018}
}
If you want to cite pymc-learn for its API, you may also want to consider this reference:
Carlson, Nicole (2018). Custom PyMC3 models built on top of the scikit-learn API. https://github.com/parsing-science/pymc3_models
Or using BibTeX as follows:
@article{Pymc3_models,
  title={pymc3_models: Custom PyMC3 models built on top of the scikit-learn API},
  author={Carlson, Nicole},
  journal={},
  url={https://github.com/parsing-science/pymc3_models},
  year={2018}
}
Getting Started
.. toctree::
   :maxdepth: 1
   :hidden:
   :caption: Getting Started

   install.rst
   support.rst
   why.rst
User Guide
The main documentation. This contains an in-depth description of all models and how to apply them.
.. toctree::
   :maxdepth: 1
   :hidden:
   :caption: User Guide

   user_guide.rst
Examples
Pymc-learn provides probabilistic models for machine learning, in a familiar scikit-learn syntax.
.. toctree::
   :maxdepth: 1
   :hidden:
   :caption: Examples

   regression.rst
   classification.rst
   mixture.rst
   neural_networks.rst
API Reference
pymc-learn leverages and extends the Base template provided by the PyMC3 Models project: https://github.com/parsing-science/pymc3_models.
.. toctree::
   :maxdepth: 1
   :hidden:
   :caption: API Reference

   api.rst
Help & reference
.. toctree::
   :maxdepth: 1
   :hidden:
   :caption: Help & reference

   develop.rst
   support.rst
   changelog.rst
   cite.rst