ISNAS-DIP: Image-Specific Neural Architecture Search for Deep Image Prior (CVPR 2022)

Metin Ersin Arican*, Ozgur Kara*, Gustav Bredell, Ender Konukoglu

[Paper] [Dataset]


News

This repo is the official implementation of ISNAS-DIP.


Overview

Figure: Visualization of the proposed metrics.

Recent works show that convolutional neural network (CNN) architectures have a spectral bias towards lower frequencies, which has been leveraged for various image restoration tasks in the Deep Image Prior (DIP) framework. The benefit of the inductive bias the network imposes in the DIP framework depends on the architecture. Therefore, researchers have studied how to automate the search to determine the best-performing model. However, common neural architecture search (NAS) techniques are resource- and time-intensive. Moreover, best-performing models are determined for a whole dataset of images instead of for each image independently, which would be prohibitively expensive. In this work, we first show that optimal neural architectures in the DIP framework are image-dependent. Leveraging this insight, we then propose an image-specific NAS strategy for the DIP framework that requires substantially less training than typical NAS approaches, effectively enabling image-specific NAS. We justify the proposed strategy's effectiveness by (1) demonstrating its performance on a NAS dataset for DIP that includes 522 models from a particular search space, and (2) conducting extensive experiments on image denoising, inpainting, and super-resolution tasks. Our experiments show that image-specific metrics can reduce the search space to a small cohort of models, of which the best model outperforms current NAS approaches for image restoration.
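
The screening idea above can be made concrete with a short sketch. The snippet below is a minimal illustration, not the repository's implementation: the names log_psd, psd_mse, and the candidates dictionary are hypothetical, and the exact image-specific metrics used in the paper may differ.

import torch
import torch.nn as nn

def log_psd(img: torch.Tensor) -> torch.Tensor:
    # Log power spectral density of a (1, C, H, W) tensor.
    spectrum = torch.fft.fft2(img)
    return torch.log10(spectrum.abs() ** 2 + 1e-12)

def psd_mse(model: nn.Module, noise: torch.Tensor, corrupted: torch.Tensor) -> float:
    # MSE between the PSD of an untrained model's output and that of the
    # corrupted image. Computed from a single forward pass -- no weight
    # updates -- so many candidate architectures can be screened cheaply.
    with torch.no_grad():
        out = model(noise)
    return torch.mean((log_psd(out) - log_psd(corrupted)) ** 2).item()

# Hypothetical usage: keep the few candidates whose init-time output
# statistics best match the corrupted image, then train only those with DIP.
# scores = {name: psd_mse(m, noise, corrupted) for name, m in candidates.items()}
# shortlist = sorted(scores, key=scores.get)[:5]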

Getting Started

Installation

1- Clone the repo:

git clone https://github.com/ozgurkara99/ISNAS-DIP.git

2- Create a conda (suggested) environment and install the required packages:

conda create -n isnasdip python=3.8
conda activate isnasdip
pip install -r requirements.txt

3- If any of the packages listed in requirements.txt fails to install, install it manually, remove it from the txt file, and rerun the command above.
4- Go to utils/paths.py and change the variable PROJECT_FOLDER to the path of the current directory (see the illustrative snippet below).
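
For example, the edit to utils/paths.py could look as follows (the path is a placeholder; keep the rest of the file unchanged):

PROJECT_FOLDER = "/absolute/path/to/ISNAS-DIP"  # set to where you cloned the repo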

Usage

  • To run the isnasdip experiment, see isnasdip.sh (example invocation below)
  • To run the nasdip experiment, see nasdip.sh
  • To run the dip experiment, see dip.sh
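
For instance, a typical run looks as follows; any task-specific settings are defined inside the scripts themselves, so inspect and edit them before running:

bash isnasdip.sh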

Citation:

If you use our code or dataset, please consider citing our paper:

@inproceedings{arican2022isnasdip,
  title={ISNAS-DIP: Image-Specific Neural Architecture Search for Deep Image Prior},
  author={Arican, Metin and Kara, Ozgur and Bredell, Gustav and Konukoglu, Ender},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2022}
}

Acknowledgements

The nasdip.py and dip.py scripts borrow some code from Chen et al. and Ulyanov et al.


Comments
  • Metrics calculated at initialization

    Hi @ozgurkara99, I'm wondering about some of the details of the method: you mentioned that the metrics (PSD, DB, MSE, etc.) were calculated at initialization from the random CNN output and the corrupted image. Does that mean you only passed the noise input into the model once, without any weight updates, and took the output from there?

    opened by FredXL1 1
  • missing 'model.csv' file?

    Hi authors, thanks for the interesting work. I'm trying to run nasdip.sh, and it seems that '/home/ersin/Documents/machine learning/NAS-DIP Summer Research/benchmark/models.csv' was not uploaded?

    opened by YilinLiu97 1
  • "None of ['model name'] are in the columns"

    Hi @ozgurkara99, somehow I got this error when computing the metrics. It happened at random_search.py line 245. Seems that the metrics are not calculated but not sure why.

    opened by FredXL1 0
  • Train our own datasets

    Hi @ozgurkara99, I'm wondering how we can train on our own dataset. I tried to modify the path in "utils/paths.py", but it seems like the code only handles a single file with a specific name? Thank you.

    opened by YilinLiu97 1
Owner
Özgür Kara
Incoming ML PhD @ Gatech