# Continual learning paper repository
This repository contains an incomplete (but regularly updated) list of papers exploring continual learning in machine learning and neuroscience. It accompanies the review paper *Towards continual task learning in artificial neural networks: current approaches and insights from neuroscience*. In the tables below, bullet count (• to •••) indicates each paper's relevance to continual learning.
- Machine learning reviews, surveys, & tutorials
- Classic papers
- Architectural approaches to continual learning
- Regularisation
- Training regime
- Neuroscience
- Citation
## arXiv link
The full paper complementing this repository is available at https://arxiv.org/abs/2112.14146
## Machine learning

### Reviews, surveys, & tutorials
Title | Link | Relevance |
---|---|---|
Continual lifelong learning with neural networks: A review | Neural Networks | ••• |
Catastrophic forgetting in connectionist networks | TICS | ••• |
Neuroscience-Inspired Artificial Intelligence | Neuron | •• |
How to grow a mind: Statistics, structure, and abstraction | Science | • |
Deep learning | Nature | • |
Universal Intelligence: A Definition of Machine Intelligence | arXiv | • |
## Classic papers
Title | Link | Relevance |
---|---|---|
Catastrophic forgetting in connectionist networks | TICS | ••• |
Catastrophic Interference in Connectionist Networks: The Sequential Learning Problem | Psychology of Learning & Motivation | ••• |
Connectionist models of recognition memory: Constraints imposed by learning and forgetting functions | Psychological Review | • |
## Architectural approaches to continual learning
Title | Link | Implementation |
---|---|---|
Progressive Neural Networks | arXiv | PyTorch TensorFlow |
Neurogenesis deep learning: Extending deep networks to accommodate new classes | IEEE | – |
Adaptive structural learning of artificial neural networks | ICML | TensorFlow |
## Regularisation
Title | Link | Implementation |
---|---|---|
Learning without forgetting | arXiv | PyTorch |
Distilling the knowledge in a neural network | arXiv | PyTorch TensorFlow |
Overcoming catastrophic forgetting in neural networks | arXiv | PyTorch TensorFlow |
Note on the quadratic penalties in elastic weight consolidation | PNAS | – |
Measuring catastrophic forgetting in neural networks | arXiv | – |
Continual learning through synaptic intelligence | ICML | TensorFlow |
An Empirical Investigation of Catastrophic Forgetting in Gradient-Based Neural Networks | arXiv | Theano |
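The regularisation papers above (notably elastic weight consolidation and synaptic intelligence) share a common idea: penalise changes to parameters that were important for previous tasks. A minimal sketch of the EWC-style quadratic penalty, where the Fisher information weights each parameter's importance (the `ewc_penalty` helper and all values are illustrative, not taken from any listed implementation):

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC-style quadratic penalty: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    theta      -- current parameters
    theta_star -- parameters after training on the previous task
    fisher     -- diagonal Fisher information estimate (parameter importance)
    lam        -- strength of the regulariser
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Illustrative example: only the first parameter has drifted from its
# previous-task value, and it carries high Fisher importance.
theta = np.array([1.0, 2.0])
theta_star = np.array([0.0, 2.0])
fisher = np.array([4.0, 10.0])

penalty = ewc_penalty(theta, theta_star, fisher, lam=2.0)  # 0.5 * 2 * (4 * 1^2) = 4.0
```

During training on a new task, this penalty would simply be added to the task loss, pulling important parameters back towards their previously learned values.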
## Training regime
Title | Link | Implementation |
---|---|---|
How transferable are features in deep neural networks? | arXiv | Caffe |
CHILD: A First Step Towards Continual Learning | Machine Learning | – |
Curriculum learning | ICML | – |
Continual Learning with Deep Generative Replay | arXiv | PyTorch |
Experience Replay for Continual Learning | arXiv | – |
Brain-inspired replay for continual learning with artificial neural networks | Nature Communications | PyTorch |
REMIND Your Neural Network to Prevent Catastrophic Forgetting | arXiv | PyTorch |
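Several of the training-regime papers above (experience replay, generative replay, REMIND) interleave stored or generated past examples with new data to mitigate forgetting. A minimal sketch of a fixed-capacity replay buffer using reservoir sampling, so that every example seen so far has an equal chance of being retained (class name and details are illustrative, not from any listed paper):

```python
import random

class ReplayBuffer:
    """Fixed-capacity experience buffer with reservoir sampling."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.n_seen = 0  # total items observed so far
        self.rng = random.Random(seed)

    def add(self, item):
        """Insert an item; once full, replace a random slot with
        probability capacity / n_seen (reservoir sampling)."""
        self.n_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(item)
        else:
            j = self.rng.randrange(self.n_seen)
            if j < self.capacity:
                self.buffer[j] = item

    def sample(self, k):
        """Draw a random replay minibatch (without replacement)."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))

# Stream 100 examples through a buffer that keeps only 5 of them.
buf = ReplayBuffer(capacity=5)
for x in range(100):
    buf.add(x)
batch = buf.sample(3)
```

In a continual-learning loop, each gradient step on new-task data would be mixed with a `sample()` of old examples, approximating the interleaved training that these papers argue prevents catastrophic forgetting.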
## Neuroscience
Title | Link | Relevance |
---|---|---|
Regulation and function of adult neurogenesis: from genes to cognition | Physiological Reviews | • |
When and where do we apply what we learn?: A taxonomy for far transfer | Psychological Bulletin | • |
Does the hippocampus map out the future? | TICS | • |
Organizing conceptual knowledge in humans with a gridlike code | Science | • |
Song replay during sleep and computational rules for sensorimotor vocal learning | Science | • |
A theory of the discovery and predication of relational concepts | Psychological Review | • |
Preplay of future place cell sequences by hippocampal cellular assemblies | Nature | •• |
Comparing continual task learning in minds and machines | PNAS | •• |
Cascade models of synaptically stored memories | Neuron | • |
Selective suppression of hippocampal ripples impairs spatial memory | Nature Neuroscience | •• |
The analogical mind | MIT Press | • |
What learning systems do intelligent agents need? Complementary learning systems theory updated | TICS | • |
Human replay spontaneously reorganizes experience | Cell | ••• |
Compartmentalized dendritic plasticity and input feature storage in neurons | Nature | • |
Neuroconstructivism: How the brain constructs cognition | Oxford University Press | • |
Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and memory | Psychological Review | ••• |
The Role of Hippocampal Replay in Memory and Planning | Current Biology | •• |
Brain imaging of language plasticity in adopted adults: Can a second language replace the first? | Cerebral Cortex | ••• |
Memory formation: Let’s replay | eLife | • |
Strengthening individual memories by reactivating them during sleep | Science | •• |
Replay of neuronal firing sequences in rat hippocampus during sleep following spatial experience | Science | • |
Crossmodal spatial attention | Annals of the NY Academy of Sciences | • |
The merging of the senses | MIT Press | • |
Development of multisensory integration from the perspective of the individual neuron | Nature Reviews Neuroscience | •• |
The hippocampal indexing theory and episodic memory: updating the index | Hippocampus | •• |
Stably maintained dendritic spines are associated with lifelong memories | Nature | ••• |
## Citation

BibTeX:

```bibtex
@misc{mccaffary2021continual,
      title={Towards continual task learning in artificial neural networks: current approaches and insights from neuroscience},
      author={David McCaffary},
      year={2021},
      eprint={2112.14146},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
```