APP: Anytime Progressive Pruning
Diganta Misra [1,2,3], Bharat Runwal [2,4], Tianlong Chen [5], Zhangyang Wang [5], Irina Rish [1,3]
[1] Mila - Quebec AI Institute, [2] Landskape AI, [3] UdeM, [4] IIT-Delhi, [5] VITA, UT-Austin
Requirements
To create a new conda environment with the dependencies used in this project, run:
conda env create -f app.yml
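Then activate the environment before running any of the commands below. The environment name is assumed here to be app; if activation fails, use the name given in the name: field of app.yml instead:
conda activate app  # assumes the environment defined in app.yml is named "app"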
To run the code on the Restricted-ImageNet dataset, first install the robustness library from here and provide the imagenet_path argument as the path to the ImageNet data folder.
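As a rough sketch only, such a run might look like the command below. Apart from imagenet_path, none of these values are documented here: the --dataset value (restricted_imagenet), the save directory, and the hyperparameters are illustrative guesses, so check the script's --help for the exact argument names, and set --meta_batch_size so that it times --meta_batch_number covers the Restricted-ImageNet training set.
# Illustrative only: the --dataset value and hyperparameters below are guesses; see the note above.
python main_anytime_train.py \
--dataset restricted_imagenet \
--imagenet_path /path/to/imagenet \
--arch resnet50 \
--seed 1 \
--epochs 50 \
--decreasing_lr 20,40 \
--batch_size 64 \
--weight_decay 1e-4 \
--meta_batch_number 8 \
--sparsity_level 4.5 \
--snip_size 0.20 \
--save_dir rimg_r50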
Run the Code
Here is an example of running Anytime Progressive Pruning (APP) on the CIFAR-10 dataset with 8 megabatches in total:
python main_anytime_train.py \
--data ../data \
--dataset cifar10 \
--arch resnet50 \
--seed 1 \
--epochs 50 \
--decreasing_lr 20,40 \
--batch_size 64 \
--weight_decay 1e-4 \
--meta_batch_size 6250 \
--meta_batch_number 8 \
--sparsity_level 4.5 \
--snip_size 0.20 \
--save_dir c10_r50
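The two megabatch arguments above split the 50,000 CIFAR-10 training images into 8 megabatches of 6,250 images each (6,250 × 8 = 50,000). Assuming that is the relationship these flags encode, a 4-megabatch variant of the same run would look like the sketch below (the save directory name is arbitrary):
# Illustrative variant, assuming meta_batch_size x meta_batch_number must cover the 50,000 training images:
# 4 megabatches of 12,500 images each (12,500 x 4 = 50,000).
python main_anytime_train.py \
--data ../data \
--dataset cifar10 \
--arch resnet50 \
--seed 1 \
--epochs 50 \
--decreasing_lr 20,40 \
--batch_size 64 \
--weight_decay 1e-4 \
--meta_batch_size 12500 \
--meta_batch_number 4 \
--sparsity_level 4.5 \
--snip_size 0.20 \
--save_dir c10_r50_4mb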
One-Shot Pruning (OSP):
python main_anytime_one.py \
--data ../data \
--dataset cifar10 \
--arch resnet50 \
--seed 1 \
--epochs 50 \
--decreasing_lr 20,40 \
--batch_size 64 \
--weight_decay 1e-4 \
--meta_batch_size 6250 \
--meta_batch_number 8 \
--sparsity_level 4.5 \
--snip_size 0.20 \
--save_dir c10_OSP_r50
Baseline:
python main_anytime_baseline.py \
--data ../data \
--dataset cifar10 \
--arch resnet50 \
--seed 1 \
--epochs 50 \
--decreasing_lr 20,40 \
--batch_size 64 \
--weight_decay 1e-4 \
--meta_batch_size 6250 \
--meta_batch_number 8 \
--save_dir c10_BASE_r50
Cite:
@misc{misra2022app,
  title={APP: Anytime Progressive Pruning},
  author={Diganta Misra and Bharat Runwal and Tianlong Chen and Zhangyang Wang and Irina Rish},
  year={2022},
  eprint={2204.01640},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}