Forecasting for knowable future events using Bayesian informative priors (forecasting with judgmental adjustment).

Overview

What is judgyprophet?

judgyprophet is a Bayesian forecasting algorithm based on Prophet that enables forecasting while using information known by the business about future events. The aim is to let users perform forecasting with judgmental adjustment in a way that is as mathematically sound as possible.

Some events will have a big effect on your timeseries, and you will be aware of some of them ahead of time. For example:

  • An existing product entering a new market.
  • A price change to a product.

These events will typically cause a large change in your timeseries of e.g. product sales, which a standard statistical forecast will consistently underestimate.

The business will often have good estimates (or at least better ones than your statistical forecast) of how these events will affect your timeseries, but this information is difficult to encode into a statistical forecasting algorithm. One option is to use a regressor, but this typically works poorly: you have no data on the event before it occurs, and after it occurs the statistical forecast does not know how to balance the information in the regressor against the trend, which can lead to erratic behaviour.

judgyprophet solves this problem by encoding the business estimate of how the event will affect the forecast (the judgmental adjustment) as a Bayesian informative prior.

Before the event occurs, this business information is used to inform the forecast of what will happen post-event, e.g. the estimated uplift in product sales once the event has happened. After the event occurs, we update what the business thinks will happen with what we actually see happening in the actuals, using standard Bayesian updating.
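To make this concrete, here is a minimal sketch of how a single trend event can be encoded as an informative prior. This is an illustration of the idea only; the exact parameterisation judgyprophet uses may differ.

y_t = \alpha + \beta t + \delta \max(0, t - t_0) + \varepsilon_t,
\qquad \delta \sim \mathcal{N}(m_0, \sigma_0^2)

Here t_0 is the known event date and m_0 is the business estimate of the slope change. Before t_0 no actuals inform \delta, so forecasts fall back on the prior mean m_0; once post-event actuals arrive, the posterior for \delta pulls the estimate towards what is actually observed.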

Installation

1. Install the judgyprophet Python package using pip:

pip install judgyprophet

2. Compile the STAN model

judgyprophet depends on STAN, whose models have to be compiled before running.

So to use judgyprophet, you have to compile the model. Do this in the shell using

python -c "from judgyprophet import JudgyProphet; JudgyProphet().compile()"

or in Python using

from judgyprophet import JudgyProphet

JudgyProphet().compile()

This will take a while, but you only have to run it once, after the initial install.

Documentation

Full documentation is available on our GitHub Pages site here.

Scroll down for a quickstart tutorial.

A runnable Jupyter notebook version of the quickstart tutorial is available here.

Roadmap

Some things on our roadmap:

  • Currently the judgyprophet STAN file is only tested on Unix-based machines (Linux or Mac). We aim to fully test on Windows machines ASAP.
  • Option to run full MCMC, rather than just L-BFGS.
  • Prediction intervals
  • Regressors/holidays

Quickstart Tutorial

Imagine your business currently operates in the US, but is launching its product in Europe. As a result it anticipates a sharp uptake in sales, which it has an estimate of. As its forecasting team, the business comes to you and asks you to account for this.

Let's look at how we might do this using judgyprophet on some example data where we know what happened. First, let's plot the data:

from judgyprophet.tutorials.resources import get_trend_event

example_data = get_trend_event()
p = example_data.plot.line()

(plot of the example data)

We can see that product sales increased sharply from about September 2020. Suppose this was a launch in a new market, and that the business had an initial estimate of the impact back in May 2020: it expected the slope increase to be 6.

Let's use judgyprophet to forecast this series from May 2020. We do this by encoding the initial business estimate as a trend event.

from judgyprophet import JudgyProphet
import pandas as pd
import seaborn as sns

# Create the expected trend events by consulting with the business
trend_events = [
    {'name': "New market entry", 'index': '2020-09-01', 'm0': 6}
]


# Cutoff the data to May 2020
data_may2020 = example_data.loc[:"2020-05-01"]

# Make the forecast with the business estimated trend event.
# We have no level events, so just provide the empty list.
jp = JudgyProphet()
# Because the event is beyond the actuals, judgyprophet throws a warning.
#    This is just because the Bayesian model at the event has no actuals to learn from.
#    The event is still used in predictions.
jp.fit(
    data=data_may2020,
    level_events=[],
    trend_events=trend_events,
    # Set random seed for reproducibility
    seed=13
)
predictions = jp.predict(horizon=12)
INFO:judgyprophet.judgyprophet:Rescaling onto 0-mean, 1-sd.
WARNING:judgyprophet.judgyprophet:Post-event data for trend event New market entry less than 0 points. Event deactivated in model. Event index: 2020-09-01, training data end index: 2019-06-01 00:00:00
WARNING:judgyprophet.utils:No active trend or level events (i.e. no event indexes overlap with data). The model will just fit a base trend to the data.


Initial log joint probability = -3.4521
    Iter      log prob        ||dx||      ||grad||       alpha      alpha0  # evals  Notes
       3      -2.92768      0.054987   8.11433e-14           1           1        7
Optimization terminated normally:
  Convergence detected: gradient norm is below tolerance

Because we are in May 2020, the forecasting algorithm has no post-event actuals to learn from, so it just uses the business estimate. Let's plot the result:

from judgyprophet.tutorials.resources import plot_forecast

plot_forecast(
    actuals=example_data,
    predictions=predictions,
    cutoff="2020-05-01",
    events=trend_events
)
INFO:prophet:Disabling yearly seasonality. Run prophet with yearly_seasonality=True to override this.
INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.
INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.



Initial log joint probability = -17.0121
Iteration  1. Log joint probability =    10.4753. Improved by 27.4875.
Iteration  2. Log joint probability =    12.7533. Improved by 2.27796.
Iteration  3. Log joint probability =    25.4696. Improved by 12.7163.
Iteration  4. Log joint probability =     26.707. Improved by 1.2374.
Iteration  5. Log joint probability =    26.7075. Improved by 0.000514342.
Iteration  6. Log joint probability =    26.7104. Improved by 0.00296558.
Iteration  7. Log joint probability =    26.7122. Improved by 0.00171322.
Iteration  8. Log joint probability =    26.7157. Improved by 0.00351772.
Iteration  9. Log joint probability =    26.7159. Improved by 0.000208268.
Iteration 10. Log joint probability =    26.7159. Improved by 6.64977e-05.
Iteration 11. Log joint probability =     26.716. Improved by 6.89899e-05.
Iteration 12. Log joint probability =     26.716. Improved by 3.06578e-05.
Iteration 13. Log joint probability =     26.716. Improved by 8.91492e-07.
Iteration 14. Log joint probability =     26.716. Improved by 8.71052e-09.

(forecast plot from the May 2020 cutoff)

We can see judgyprophet is accounting for the increased trend, but the business slightly overestimated the increase in sales due to the product launch.

Let's fast forward to January 2021: the business wants to reforecast based on its estimate and what it has seen so far for the product launch. This is where judgyprophet comes into its own.

Once actuals are observed after the event has taken place, judgyprophet updates its estimate of what the event impact is. Let's look at this in action:

# Cutoff the data to January 2021
data_jan2021 = example_data.loc[:"2021-01-01"]

# Reforecast using the new actuals, now that we are at Jan 2021
jp = JudgyProphet()
jp.fit(
    data=data_jan2021,
    level_events=[],
    trend_events=trend_events,
    # Set random seed for reproducibility
    seed=13
)
predictions = jp.predict(horizon=12)
INFO:judgyprophet.judgyprophet:Rescaling onto 0-mean, 1-sd.
INFO:judgyprophet.judgyprophet:Adding trend event New market entry to model. Event index: 2020-09-01, training data start index: 2019-06-01 00:00:00, training data end index: 2021-01-01 00:00:00. Initial gradient: 6. Damping: None.


Initial log joint probability = -309.562
    Iter      log prob        ||dx||      ||grad||       alpha      alpha0  # evals  Notes
      10      -1.64341   2.10244e-05   3.61281e-06           1           1       15
Optimization terminated normally:
  Convergence detected: relative gradient magnitude is below tolerance

Now let's plot the results:

plot_forecast(actuals=example_data, predictions=predictions, cutoff="2021-01-01", events=trend_events)
INFO:prophet:Disabling yearly seasonality. Run prophet with yearly_seasonality=True to override this.
INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.
INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.



Initial log joint probability = -24.5881
Iteration  1. Log joint probability =   -1.06803. Improved by 23.5201.
Iteration  2. Log joint probability =    11.6215. Improved by 12.6895.
Iteration  3. Log joint probability =    36.5271. Improved by 24.9056.
Iteration  4. Log joint probability =    37.3776. Improved by 0.850488.
Iteration  5. Log joint probability =    37.6489. Improved by 0.271259.
Iteration  6. Log joint probability =    37.6547. Improved by 0.00580657.
Iteration  7. Log joint probability =    37.7831. Improved by 0.128419.
Iteration  8. Log joint probability =    37.7884. Improved by 0.00527858.
Iteration  9. Log joint probability =     37.789. Improved by 0.000612124.
Iteration 10. Log joint probability =    37.7891. Improved by 9.93823e-05.
Iteration 11. Log joint probability =    37.7902. Improved by 0.00112416.
Iteration 12. Log joint probability =    37.7902. Improved by 3.17397e-06.
Iteration 13. Log joint probability =    37.7902. Improved by 1.59404e-05.
Iteration 14. Log joint probability =    37.7902. Improved by 5.06854e-07.
Iteration 15. Log joint probability =    37.7902. Improved by 6.87792e-07.
Iteration 16. Log joint probability =    37.7902. Improved by 4.82761e-08.
Iteration 17. Log joint probability =    37.7902. Improved by 2.50385e-07.
Iteration 18. Log joint probability =    37.7902. Improved by 6.60322e-09.

(forecast plot from the January 2021 cutoff)

In this case, once judgyprophet observes the post-event data, the Bayesian updating starts to realise the business estimate is a bit too large, so it reduces it.
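As a rough intuition (not judgyprophet's exact computation): with a Gaussian prior and likelihood, the posterior estimate of the slope change is a precision-weighted average of the business estimate m_0 and the slope change implied by the post-event actuals,

\delta_{\text{post}} = \frac{\sigma_0^{-2} m_0 + \tau \, \hat{\delta}_{\text{data}}}{\sigma_0^{-2} + \tau},

where \hat{\delta}_{\text{data}} is the slope change estimated from the post-event actuals and \tau is its precision, which grows as more post-event points are observed. The more post-event data we see, the more the forecast relies on it rather than on the original business estimate.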

This was a simple example to demonstrate judgyprophet. You can add many trend events within a single forecasting horizon, and add damping (see the sketch below). You can also add level events (changes in the forecast level) and seasonality; see our other tutorials for details.
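For example, a hypothetical set-up with two trend events, the second one damped via the optional 'gamma' key (a float between 0 and 1, with values below 0.8 not recommended), might look like the sketch below. The second event's name, date and estimates are made up purely for illustration:

# Illustrative sketch only: two hypothetical trend events, the second with damping.
# Omitting 'gamma' (or setting it to 1) gives an undamped linear trend.
trend_events = [
    {'name': "New market entry", 'index': '2020-09-01', 'm0': 6},
    {'name': "Price change", 'index': '2021-03-01', 'm0': -2, 'gamma': 0.9},
]

jp = JudgyProphet()
jp.fit(
    data=example_data.loc[:"2021-01-01"],
    level_events=[],  # level events are passed as a similar list of dicts
    trend_events=trend_events,
    seed=13,
)
predictions = jp.predict(horizon=12)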
