Mammoth - An Extendible (General) Continual Learning Framework for Pytorch
NEWS
STAY TUNED: We are working on an update of this repository to include the codebase of our extended paper, "Class-Incremental Continual Learning into the eXtended DER-verse".
Official repository of "Dark Experience for General Continual Learning: a Strong, Simple Baseline" (NeurIPS 2020).
Setup
- Use `./utils/main.py` to run experiments.
- Use the argument `--load_best_args` to use the best hyperparameters from the paper (see the example below).
- New models can be added to the `models/` folder.
- New datasets can be added to the `datasets/` folder.
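For example, a run of DER++ on Sequential CIFAR-10 with the best hyperparameters from the paper might look like the following (an illustrative invocation; the exact model, dataset, and buffer flags are defined by the files in `models/` and `datasets/`):

```
python ./utils/main.py --model derpp --dataset seq-cifar10 --buffer_size 500 --load_best_args
```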
Models
- Gradient Episodic Memory (GEM)
- A-GEM
- A-GEM with Reservoir (A-GEM-R)
- Experience Replay (ER)
- Meta-Experience Replay (MER)
- Function Distance Regularization (FDR)
- Greedy gradient-based Sample Selection (GSS)
- Hindsight Anchor Learning (HAL)
- Incremental Classifier and Representation Learning (iCaRL)
- online Elastic Weight Consolidation (oEWC)
- Synaptic Intelligence (SI)
- Learning without Forgetting (LwF)
- Progressive Neural Networks (PNN)
- Dark Experience Replay (DER)
- Dark Experience Replay++ (DER++) (see the objective sketch after this list)
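DER and DER++ are the methods proposed in the paper. As a point of reference, the minimal sketch below (plain PyTorch; the function and argument names are illustrative and not the repository's actual API) shows the DER++ objective: cross-entropy on the current batch, an MSE term aligning the network's outputs with the logits stored in the replay buffer, and an additional cross-entropy term on the buffered labels.

```python
import torch.nn.functional as F

def derpp_loss(net, inputs, labels, buf_inputs, buf_logits, buf_labels,
               alpha=0.5, beta=0.5):
    # Standard classification loss on the current task's batch.
    loss = F.cross_entropy(net(inputs), labels)
    # Dark knowledge: match the logits recorded when the buffer samples were stored.
    loss = loss + alpha * F.mse_loss(net(buf_inputs), buf_logits)
    # DER++ additionally replays the ground-truth labels of buffered samples
    # (the paper draws two independent buffer minibatches for the two replay
    # terms; a single one is reused here for brevity).
    loss = loss + beta * F.cross_entropy(net(buf_inputs), buf_labels)
    return loss
```

Setting `beta` to zero recovers plain DER, which only distills the stored logits.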
Datasets
Class-IL / Task-IL settings
- Sequential MNIST
- Sequential CIFAR-10
- Sequential Tiny ImageNet
Domain-IL settings
- Permuted MNIST
- Rotated MNIST
General Continual Learning setting
- MNIST-360
Citing this work
@inproceedings{buzzega2020dark,
    author    = {Buzzega, Pietro and Boschini, Matteo and Porrello, Angelo and Abati, Davide and Calderara, Simone},
    booktitle = {Advances in Neural Information Processing Systems},
    editor    = {H. Larochelle and M. Ranzato and R. Hadsell and M. F. Balcan and H. Lin},
    pages     = {15920--15930},
    publisher = {Curran Associates, Inc.},
    title     = {Dark Experience for General Continual Learning: a Strong, Simple Baseline},
    volume    = {33},
    year      = {2020}
}