One-Shot Neural Ensemble Architecture Search by Diversity-Guided Search Space Shrinking

Overview

This is the official implementation of NEAS, presented at CVPR 2021.

Environment Setup

To set up the environment, run the following commands:

git clone https://github.com/researchmm/NEAS.git
cd NEAS
conda create -n NEAS python=3.6
conda activate NEAS
sh ./install.sh
# (required) install apex to accelerate training; it is slightly faster than PyTorch DistributedDataParallel
cd lib
git clone https://github.com/NVIDIA/apex.git
python ./apex/setup.py install --cpp_ext --cuda_ext
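
To verify the setup, an optional sanity check (the exact PyTorch and CUDA versions depend on what install.sh pins):

# check that PyTorch imports and a GPU is visible
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"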

Data Preparation

You need to first download the ImageNet-2012 dataset to the folder ./data/imagenet and move the validation set to the subfolder ./data/imagenet/val. To move the validation set, you could use the following script: https://raw.githubusercontent.com/soumith/imagenetloader.torch/master/valprep.sh
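
One possible way to run that script, from inside the validation folder (paths follow the layout described below):

cd ./data/imagenet/val
wget https://raw.githubusercontent.com/soumith/imagenetloader.torch/master/valprep.sh
bash valprep.sh
cd -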

The directory structure follows the standard layout shown below.

/path/to/imagenet/
  train/
    class1/
      img1.jpeg
    class2/
      img2.jpeg
  val/
    class1/
      img3.jpeg
    class2/
      img4.jpeg
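
As an optional sanity check (ImageNet-2012 has 1,000 classes, so both splits should contain 1,000 class folders):

python -c "import os; print(len(os.listdir('./data/imagenet/train')), len(os.listdir('./data/imagenet/val')))"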

Model Zoo

For evaluation, we provide the checkpoints of our models in Google Drive.

After downloading the models, you can run the evaluation following the description in Quick Start - Test.

Model download links:

Model    FLOPs   Top-1 Acc. (%)   Top-5 Acc. (%)   Link
NEAS-S   314M    77.9             93.9             Google Drive
NEAS-M   472M    79.5             94.6             Google Drive
NEAS-L   574M    80.0             94.8             Google Drive

Quick Start

We provide the test code for NEAS as follows.

Test

To test our trained models, you need to put the downloaded model in PATH_TO_CKP (the default path is ./CKP in the root directory). After that, you need to specify the model path in the corresponding config file by changing the initial-checkpoint argument in ./configs/subnets/[SELECTED_MODEL_SIZE].yaml.
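
For example (the checkpoint filename below is illustrative; use the name of the file you downloaded):

mkdir -p ./CKP
mv ~/Downloads/neas_s.pth.tar ./CKP/
# then set initial-checkpoint in ./configs/subnets/[SELECTED_MODEL_SIZE].yaml to ./CKP/neas_s.pth.tar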

Then, you could use the following command to test the model.

sh ./tools/distributed_test.sh ./configs/subnets/[SELECTED_MODEL_SIZE].yaml

The test result will be saved in ./experiments. You can also add [--output OUTPUT_PATH] in ./tools/distributed_test.sh to specify a custom output path.

To Do List

  • Test code
  • Retrain code
  • Search code

BibTex

@article{NEAS,
  title={One-Shot Neural Ensemble Architecture Search by Diversity-Guided Search Space Shrinking},
  author={Chen, Minghao and Peng, Houwen and Fu, Jianlong and Ling, Haibin},
  journal={arXiv preprint arXiv:2104.00597},
  year={2021}
}