PyTorch implementation of COIN, a framework for compression with implicit neural representations 🌸

COIN 🌟

This repo contains a PyTorch implementation of COIN: COmpression with Implicit Neural representations, including code to reproduce all experiments and plots in the paper.
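
For orientation, here is a minimal sketch of the core idea, not the repo's actual training code (the names, default sizes, and the omission of SIREN's custom weight initialization are all simplifications): a small MLP with sine activations is overfit to map each pixel's (x, y) coordinate to its RGB value, and the trained weights then serve as the compressed representation of the image.

import torch
import torch.nn as nn

class Sine(nn.Module):
    # Sine activation with frequency w0, as in SIREN.
    def __init__(self, w0=30.0):
        super().__init__()
        self.w0 = w0

    def forward(self, x):
        return torch.sin(self.w0 * x)

def make_siren(num_layers=10, layer_size=28):
    # MLP mapping a 2D pixel coordinate to an RGB value.
    layers, in_dim = [], 2
    for _ in range(num_layers):
        layers += [nn.Linear(in_dim, layer_size), Sine()]
        in_dim = layer_size
    layers.append(nn.Linear(in_dim, 3))
    return nn.Sequential(*layers)

def fit_image(img, steps=1000, lr=2e-4):
    # img: float tensor of shape (3, H, W) with values in [0, 1].
    _, h, w = img.shape
    # One (y, x) coordinate per pixel, normalized to [-1, 1], in row-major
    # order to match the flattened image below.
    coords = torch.cartesian_prod(
        torch.linspace(-1, 1, h), torch.linspace(-1, 1, w)
    )
    targets = img.permute(1, 2, 0).reshape(-1, 3)

    model = make_siren()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        loss = ((model(coords) - targets) ** 2).mean()
        loss.backward()
        optimizer.step()
    return model  # model.state_dict() is the "compressed" image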

Requirements

We ran our experiments with Python 3.8.7, using torch 1.7.0 and torchvision 0.8.0, but the code is likely to work with earlier versions too. All requirements can be installed with

pip install -r requirements.txt

Usage

Compression

To compress the image kodak-dataset/kodim15.png, run

python main.py -ld logs_dir

This will save the COIN model and the reconstruction of the image (as well as logs of the losses and PSNR) to the logs_dir directory. To run on a specific image in the Kodak dataset, add the -iid flag. For example, to compress image 3, run

python main.py -ld logs_dir -iid 3

To compress the entire Kodak dataset, run

python main.py -ld logs_dir -fd

NOTE: The half-precision version of torch.sin is only implemented in CUDA, so half-precision models can only be run on a GPU; a GPU is therefore required to reproduce the results from the paper.
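
For example (illustrative only, reusing the fitted model and coordinate grid from the sketch above), half-precision evaluation has to happen on a CUDA device:

# torch.sin has no float16 CPU kernel, so move everything to the GPU first.
device = torch.device("cuda")
model = model.half().to(device)
with torch.no_grad():
    reconstruction = model(coords.half().to(device))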

To reproduce the results from the paper, run the architectures listed in Appendix A:

python main.py -ld logs_dir -fd --num_layers 5 --layer_size 20
python main.py -ld logs_dir -fd --num_layers 5 --layer_size 30
python main.py -ld logs_dir -fd --num_layers 10 --layer_size 28
python main.py -ld logs_dir -fd --num_layers 10 --layer_size 40
python main.py -ld logs_dir -fd --num_layers 13 --layer_size 49
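
These architectures trade model size against reconstruction quality. As a rough guide, the bits-per-pixel of each model on a 768×512 Kodak image, stored at 16 bits per weight, can be estimated as follows (this assumes num_layers counts hidden layers of width layer_size with a 2D input and 3D output, which may not match the repo's exact accounting):

def bpp(num_layers, layer_size, h=512, w=768, bits_per_param=16):
    # Parameter count of the MLP: weights plus biases for each linear layer.
    dims = [2] + [layer_size] * num_layers + [3]
    n_params = sum(i * o + o for i, o in zip(dims[:-1], dims[1:]))
    return n_params * bits_per_param / (h * w)

for nl, ls in [(5, 20), (5, 30), (10, 28), (10, 40), (13, 49)]:
    print(f"layers={nl:2d}  width={ls:2d}  bpp={bpp(nl, ls):.3f}")

Under these assumptions the five architectures span roughly 0.07 to 1.2 bpp.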

Plots

To recreate plots from the paper, run

python plots.py

See the plots.py file to customize plots.

Acknowledgements

Our benchmarks and plots are based on the CompressAI library. Our SIREN implementation is based on lucidrains' implementation.

License

MIT


Comments
• Training process

In the training loop (lines 51–125 of main.py), a single model is trained on the images one by one.

However, I thought it would be better to train a separate model for each image (one model per image), since that lets each model overfit its image more.

Why didn't you train the models that way?


Also, suppose a single model trained on multiple images really is better. Why then feed the images to the model one by one? Mini-batch training (batch size = 4, 8, 16, ...) would be another option.

    opened by kimwongyuda 2
Owner
Emilien Dupont
Machine Learning is coolbeans 🌴 twitter.com/emidup