Meta-SparseINR
Official PyTorch implementation of "Meta-learning Sparse Implicit Neural Representations" (NeurIPS 2021) by Jaeho Lee*, Jihoon Tack*, Namhoon Lee, and Jinwoo Shin.
TL;DR: We develop a scalable method to learn sparse neural representations for a large set of signals.
Figure: Illustrations of (a) an implicit neural representation (INR), (b) the standard pruning algorithm, which prunes and retrains the model for each signal considered, and (c) the proposed Meta-SparseINR procedure, which finds a sparse initial INR that can be trained further to fit each signal.
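For concreteness, here is a minimal sketch of what an INR is as code: a small MLP that maps pixel coordinates to color values and is fit with gradient descent on a reconstruction loss. The SIREN-style sine activation, the `w0` frequency, and all sizes are illustrative choices, not necessarily the architecture used in this repository.

```python
# Minimal sketch of an implicit neural representation (INR): an MLP mapping
# (x, y) coordinates to RGB values. SIREN-style sine activations and all
# sizes are illustrative; the repository's architecture may differ.
import torch
import torch.nn as nn

class SimpleINR(nn.Module):
    def __init__(self, in_dim=2, hidden=256, out_dim=3, depth=4, w0=30.0):
        super().__init__()
        dims = [in_dim] + [hidden] * depth + [out_dim]
        self.linears = nn.ModuleList(
            [nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)]
        )
        self.w0 = w0  # frequency scale for the sine nonlinearity

    def forward(self, coords):
        x = coords
        for linear in self.linears[:-1]:
            x = torch.sin(self.w0 * linear(x))
        return self.linears[-1](x)

# Fitting one signal: regress pixel values at their (x, y) coordinates.
inr = SimpleINR()
coords = torch.rand(1024, 2) * 2 - 1         # coordinates in [-1, 1]^2
pixels = torch.rand(1024, 3)                 # placeholder target values
loss = ((inr(coords) - pixels) ** 2).mean()  # MSE reconstruction loss
```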
1. Requirements
```bash
conda create -n inrprune python=3.7
conda activate inrprune
conda install pytorch torchvision cudatoolkit=11.1 -c pytorch -c nvidia
pip install torchmeta
pip install imageio einops tensorboardX
```
Datasets
- Download the Imagenette and SDF datasets from the following page:
- Place the downloaded files in the `/data` folder.
2. Training
Training options
The training scripts take the following dataset option:
`<DATASET>`: {`celeba`, `sdf`, `imagenette`}
Meta-SparseINR (ours)
```bash
# Train dense model first
python main.py --exp meta_baseline --epoch 150000 --data <DATASET>

# Iterative pruning (magnitude pruning)
python main.py --exp metaprune --epoch 30000 --pruner MP --amount 0.2 --data <DATASET>
```
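The `meta_baseline` stage meta-learns a dense INR initialization across many signals. Below is a minimal MAML-style sketch of one meta-step, assuming PyTorch >= 2.0 for `torch.func.functional_call`; the repository itself builds on `torchmeta`, and `sample_signals`, the tiny model, and all hyperparameters here are stand-ins rather than the actual interface.

```python
# Illustrative MAML-style meta-step: adapt to each signal with a few inner
# gradient steps, then update the shared initialization with the
# post-adaptation losses. Requires torch >= 2.0 (torch.func).
import torch
import torch.nn as nn

def inner_adapt(model, coords, target, inner_lr=1e-2, steps=2):
    # A few gradient steps on one signal; create_graph=True keeps the
    # adaptation differentiable for the outer (meta) update.
    params = dict(model.named_parameters())
    for _ in range(steps):
        pred = torch.func.functional_call(model, params, (coords,))
        loss = ((pred - target) ** 2).mean()
        grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
        params = {name: p - inner_lr * g
                  for (name, p), g in zip(params.items(), grads)}
    return params

def sample_signals(num=4, n=1024):
    # Stand-in for the real data loader: random (coords, pixels) pairs.
    for _ in range(num):
        yield torch.rand(n, 2) * 2 - 1, torch.rand(n, 3)

model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 3))
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-4)

signals = list(sample_signals())
meta_opt.zero_grad()
for coords, target in signals:
    adapted = inner_adapt(model, coords, target)
    pred = torch.func.functional_call(model, adapted, (coords,))
    (((pred - target) ** 2).mean() / len(signals)).backward()
meta_opt.step()
```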
Random Pruning
```bash
# Train dense model first
python main.py --exp meta_baseline --epoch 150000 --data <DATASET>

# Iterative pruning (random pruning)
python main.py --exp metaprune --epoch 30000 --pruner RP --amount 0.2 --data <DATASET>
```
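Both pipelines remove 20% of the surviving weights per round (`--amount 0.2`) and then resume meta-training on the remainder. A minimal sketch of one such round under those assumptions; the `MP`/`RP` pruners in the repository may be implemented differently. During the subsequent retraining, the binary masks are kept fixed so pruned weights stay zero.

```python
# One global pruning round over all Linear layers: score each weight by
# magnitude (MP) or uniform noise (RP), then drop the lowest-scoring 20%
# of the currently-surviving weights. Illustrative, not the repo's pruner.
import torch
import torch.nn as nn

@torch.no_grad()
def prune_round(model, masks, pruner="MP", amount=0.2):
    weights = [m.weight for m in model.modules() if isinstance(m, nn.Linear)]
    # Score each weight once: |w| for MP, random for RP.
    scores = [w.abs() if pruner == "MP" else torch.rand_like(w) for w in weights]
    # Global threshold computed over the surviving weights only.
    survivors = torch.cat([s[m.bool()] for s, m in zip(scores, masks)])
    threshold = torch.quantile(survivors, amount)
    for w, s, m in zip(weights, scores, masks):
        m *= (s > threshold).float()  # drop the lowest-scoring fraction
        w *= m                        # zero the pruned weights in place

model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 3))
masks = [torch.ones_like(m.weight) for m in model.modules()
         if isinstance(m, nn.Linear)]
prune_round(model, masks, pruner="MP")  # survivors: ~80%
prune_round(model, masks, pruner="MP")  # survivors: ~64%, and so on
```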
Dense-Narrow
```bash
# Train a dense model at each width (shell-script loop)
widthlist="230 206 184 164 148 132 118 106 94 84 76 68 60 54 48 44 38 34 32 28"
for width in $widthlist
do
    python main.py --exp meta_baseline --epoch 150000 --data <DATASET> --width $width --id width_$width
done
```
3. Evaluation
Evaluation options
The evaluation scripts take the following options:
`<DATASET>`: {`celeba`, `sdf`, `imagenette`}
`<OPT_TYPE>`: {`default`, `two_step_sgd`}, where `default` denotes the Adam optimizer with 100 steps.
We assume that all of the checkpoints above have already been trained.
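With `--opt_type default`, each test signal is fitted by running 100 Adam steps from the meta-learned (sparse) initialization and measuring reconstruction quality. A minimal sketch of such a per-signal loop; apart from the 100-step Adam setting stated above, the learning rate, PSNR convention, and helper names are illustrative.

```python
# Illustrative per-signal evaluation: fine-tune a copy of the meta-learned
# initialization with Adam for 100 steps, then report PSNR. For a pruned
# model, the binary masks should be re-applied after each step so the
# pruned weights stay zero.
import copy
import math
import torch

def evaluate_signal(init_model, coords, target, steps=100, lr=1e-4):
    model = copy.deepcopy(init_model)  # keep the shared init untouched
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        loss = ((model(coords) - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    mse = ((model(coords) - target) ** 2).mean().item()
    return 10.0 * math.log10(1.0 / mse)  # PSNR, assuming targets in [0, 1]
```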
Meta-SparseINR (ours)
```bash
python eval.py --exp prune --pruner MP --data <DATASET> --opt_type <OPT_TYPE>
```
Baselines
```bash
# Random pruning
python eval.py --exp prune --pruner RP --data <DATASET> --opt_type <OPT_TYPE>

# Dense-Narrow
python eval.py --exp dense_narrow --data <DATASET> --opt_type <OPT_TYPE>

# MAML + One-Shot
python eval.py --exp one_shot --data <DATASET> --opt_type default

# MAML + IMP
python eval.py --exp imp --data <DATASET> --opt_type default

# Scratch
python eval.py --exp scratch --data <DATASET> --opt_type <OPT_TYPE>
```
4. Experimental Results
Citation
```bibtex
@inproceedings{lee2021meta,
  title={Meta-learning Sparse Implicit Neural Representations},
  author={Jaeho Lee and Jihoon Tack and Namhoon Lee and Jinwoo Shin},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}
```