Principled Detection of Out-of-Distribution Examples in Neural Networks

Overview

ODIN: Out-of-Distribution Detector for Neural Networks

This is a PyTorch implementation for detecting out-of-distribution examples in neural networks. The method is described in the paper Principled Detection of Out-of-Distribution Examples in Neural Networks by Shiyu Liang, Yixuan Li, and R. Srikant. On DenseNet trained on CIFAR-10, the method reduces the false positive rate from the baseline's 34.7% to 4.3% when the true positive rate is 95%.
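
The method, ODIN, combines two ingredients: temperature scaling of the softmax and a small input perturbation in the direction that raises the predicted class's confidence. The following is a minimal PyTorch sketch of the scoring step, written here for illustration: the function name odin_score is our own, and the repository's calData.py additionally divides the gradient by the per-channel input standard deviations, which this sketch omits.

import torch
import torch.nn.functional as F

def odin_score(model, x, temperature=1000.0, magnitude=0.0014):
    # model should be in eval() mode; x is a batch of input images.
    x = x.clone().detach().requires_grad_(True)
    logits = model(x) / temperature
    # Cross-entropy against the predicted class equals -log S(x; T) of the
    # top class, so stepping against its gradient increases that score.
    labels = logits.argmax(dim=1)
    loss = F.cross_entropy(logits, labels)
    loss.backward()
    x_perturbed = (x - magnitude * x.grad.sign()).detach()
    with torch.no_grad():
        scores = F.softmax(model(x_perturbed) / temperature, dim=1)
    # In-distribution inputs tend to receive higher maximum softmax scores;
    # thresholding this value yields the detector.
    return scores.max(dim=1).values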

Experimental Results

We used two neural network architectures, DenseNet-BC and Wide ResNet. The PyTorch implementation of DenseNet-BC is provided by Andreas Veit and Brandon Amos, and the PyTorch implementation of Wide ResNet is provided by Sergey Zagoruyko. The definition of each metric can be found in the paper. [Figure: detection performance across architectures and out-of-distribution datasets.]
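
For reference, two of the reported metrics have simple definitions. The sketch below is our own NumPy rendering, assuming higher scores indicate in-distribution inputs; AUROC and AUPR can be computed with standard tools such as sklearn.metrics.

import numpy as np

def fpr_at_95_tpr(in_scores, out_scores):
    # Threshold chosen so that 95% of in-distribution scores pass.
    threshold = np.percentile(in_scores, 5)
    # Fraction of out-of-distribution scores that also pass (false positives).
    return float(np.mean(out_scores >= threshold))

def detection_error(in_scores, out_scores):
    # Misclassification probability at 95% TPR, assuming the test set
    # contains equal numbers of in- and out-of-distribution examples.
    return 0.5 * (1 - 0.95) + 0.5 * fpr_at_95_tpr(in_scores, out_scores)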

Pre-trained Models

We provide four pre-trained neural networks: (1) two DenseNet-BC networks trained on CIFAR-10 and CIFAR-100, respectively; (2) two Wide ResNet networks trained on CIFAR-10 and CIFAR-100, respectively. The test error rates (%) are given below:

Architecture CIFAR-10 CIFAR-100
DenseNet-BC 4.81 22.37
Wide ResNet 3.71 19.86

Running the code

Dependencies

  • CUDA 8.0

  • PyTorch

  • Anaconda2 or 3

  • At least three GPUs

    Note: Reproducing the DenseNet-BC results requires only one GPU, but reproducing the Wide ResNet results requires three GPUs. A single-GPU version for Wide ResNet will be released in the future.

Downloading Out-of-Distribution Datasets

We provide download links for five out-of-distribution datasets: Tiny-ImageNet (crop), Tiny-ImageNet (resize), LSUN (crop), LSUN (resize), and iSUN (see the args.out_dataset table below).

Here is an example of downloading the Tiny-ImageNet (crop) dataset. In the root directory, run

mkdir data
cd data
wget https://www.dropbox.com/s/avgm2u562itwpkl/Imagenet.tar.gz
tar -xvzf Imagenet.tar.gz
cd ..
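
After unpacking, the images can be read with a standard torchvision ImageFolder loader. The sketch below is ours rather than the repository's loader code; the folder layout and the CIFAR normalization constants (means 125.3/123.0/113.9 and stds 63.0/62.1/66.7 on the 0-255 scale) are assumptions based on common practice for these models.

import torch
import torchvision
import torchvision.transforms as transforms

# Normalize with CIFAR per-channel statistics (assumed to match training).
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((125.3 / 255, 123.0 / 255, 113.9 / 255),
                         (63.0 / 255, 62.1 / 255, 66.7 / 255)),
])

# ImageFolder expects data/Imagenet/<subdir>/<image> after unpacking.
ood_set = torchvision.datasets.ImageFolder("data/Imagenet", transform=transform)
ood_loader = torch.utils.data.DataLoader(ood_set, batch_size=100, shuffle=False)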

Downloading Neural Network Models

We provide download links for four pre-trained models.

Here is an example of downloading DenseNet-BC trained on CIFAR-10. In the root directory, run

mkdir models
cd models
wget https://www.dropbox.com/s/wr4kjintq1tmorr/densenet10.pth.tar.gz
tar -xvzf densenet10.pth.tar.gz
cd ..
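
A quick way to sanity-check the download is to load the checkpoint in Python. This is a hedged sketch: it assumes the archive unpacks to models/densenet10.pth and that the file stores a full serialized module (so the matching densenet.py must be importable); if only a state_dict is stored, load it into a constructed model instead.

import torch

# Assumed path and checkpoint layout; adjust to what the archive contains.
model = torch.load("models/densenet10.pth", map_location="cpu")
model.eval()  # fix BatchNorm statistics and disable dropout for scoring
print(model.__class__.__name__)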

Running

Here is an example that reproduces the results of DenseNet-BC trained on CIFAR-10, with Tiny-ImageNet (crop) as the out-of-distribution dataset. The temperature is set to 1000 and the perturbation magnitude to 0.0014. In the root directory, run

cd code
# model: DenseNet-BC, in-distribution: CIFAR-10, out-distribution: TinyImageNet (crop)
# magnitude: 0.0014, temperature 1000, gpu: 0
python main.py --nn densenet10 --out_dataset Imagenet --magnitude 0.0014 --temperature 1000 --gpu 0

Note: Please choose arguments according to the following.

args

  • args.nn: the arguments of neural network models are shown as follows

    Neural Network Models               args.nn
    DenseNet-BC trained on CIFAR-10     densenet10
    DenseNet-BC trained on CIFAR-100    densenet100
    Wide ResNet trained on CIFAR-10     wideresnet10
    Wide ResNet trained on CIFAR-100    wideresnet100
  • args.out_dataset: the arguments of out-of-distribution datasets are shown as follows

    Out-of-Distribution Datasets    args.out_dataset
    Tiny-ImageNet (crop)            Imagenet
    Tiny-ImageNet (resize)          Imagenet_resize
    LSUN (crop)                     LSUN
    LSUN (resize)                   LSUN_resize
    iSUN                            iSUN
    Uniform random noise            Uniform
    Gaussian random noise           Gaussian
  • args.magnitude: the optimal noise magnitudes are listed below. In practice, the optimal noise magnitude is model-specific and needs to be tuned accordingly (a tuning sketch follows this list).

    Out-of-Distribution Datasets    densenet10    densenet100    wideresnet10    wideresnet100
    Tiny-ImageNet (crop)            0.0014        0.0014         0.0005          0.0028
    Tiny-ImageNet (resize)          0.0014        0.0028         0.0011          0.0028
    LSUN (crop)                     0             0.0028         0               0.0048
    LSUN (resize)                   0.0014        0.0028         0.0006          0.002
    iSUN                            0.0014        0.0028         0.0008          0.0028
    Uniform random noise            0.0014        0.0028         0.0014          0.0028
    Gaussian random noise           0.0014        0.0028         0.0014          0.0028
  • args.temperature: temperature is set to 1000 in all cases.

  • args.gpu: make sure you use the following GPU when running the code:

    Neural Network Models    args.gpu
    densenet10               0
    densenet100              0
    wideresnet10             1
    wideresnet100            2
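
When applying the detector to a new model (see also the first comment below), the paper tunes the hyperparameters on a held-out out-of-distribution validation set. A coarse grid search such as the following sketch is one reasonable approach; score_fn, the grids, and the selection criterion (FPR at 95% TPR) are our assumptions rather than the repository's code.

import itertools
import numpy as np

def tune_odin(score_fn, val_in, val_out,
              temperatures=(1, 10, 100, 1000),
              magnitudes=tuple(np.arange(0.0, 0.0041, 0.0002))):
    # score_fn(T, eps, data) should return ODIN scores for a dataset,
    # e.g. a wrapper around the scoring sketch in the overview.
    best = None
    for T, eps in itertools.product(temperatures, magnitudes):
        in_scores = score_fn(T, eps, val_in)
        out_scores = score_fn(T, eps, val_out)
        threshold = np.percentile(in_scores, 5)  # retains 95% TPR in-dist
        fpr = float(np.mean(out_scores >= threshold))
        if best is None or fpr < best[0]:
            best = (fpr, T, eps)
    return best  # (validation FPR at 95% TPR, temperature, magnitude)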

Outputs

Here is an example of output.

Neural network architecture:          DenseNet-BC-100
In-distribution dataset:                     CIFAR-10
Out-of-distribution dataset:     Tiny-ImageNet (crop)

                          Baseline         Our Method
FPR at TPR 95%:              34.8%               4.3% 
Detection error:              9.9%               4.6%
AUROC:                       95.3%              99.1%
AUPR In:                     96.4%              99.2%
AUPR Out:                    93.8%              99.1%

Comments
  • How to search hyperparameters for a new model?

    Hi - thanks for your work.

    I would like to use your method on another architecture.

    I see that there are three hyperparameters:

    1. Perturbation magnitude (epsilon)
    2. Temperature (T)
    3. Threshold (delta)

    How should I choose these hyperparameters using the OOD validation set?

    I understand that the OOD validation set should be different from the ones used for testing, but I'm unsure which of these hyperparameters I should prioritise.

    Should I do a coarse grid search over all of them?

    Thank you!

    opened by ashok-arjun 2
  • Question about calData.py

    At lines 63-65 of calData.py, you normalize the gradient to the same space as the image, as follows:

        gradient[0][0] = (gradient[0][0])/(63.0/255.0)
        gradient[0][1] = (gradient[0][1])/(62.1/255.0)
        gradient[0][2] = (gradient[0][2])/(66.7/255.0)

    I want to know why you introduce the above normalization and how the values, i.e., 63.0, 62.1, and 66.7, were chosen.

    Thanks in advance, Kimin Lee.

    opened by pokaxpoka 2
  • Could you release the code for training the DenseNet and ResNet models?

    I trained a new DenseNet-121 with code from https://github.com/kuangliu/pytorch-cifar (Loss: 0.179 | Acc: 95.580%). However, it seems ODIN doesn't work on this model.

    temperature =  200  epsilon =  0.000064
    Neural network architecture:                 DenseNet
    In-distribution dataset:                     CIFAR-10
    Out-of-distribution dataset:     Tiny-ImageNet (crop)

                              Baseline         Our Method
    FPR at TPR 95%:              39.2%              27.8%
    Detection error:             11.3%              12.3%
    AUROC:                       93.2%              92.2%
    AUPR In:                     91.7%              87.7%
    AUPR Out:                    92.0%              93.1%

    This is the best ODIN performance I can get, with temperature = 200 and epsilon = 0.000064, and it is worse than the baseline.

    Is this method model-dependent? Could you release the code for training the DenseNet and ResNet models used in this project? Thanks!

    opened by yw981 1
  • Question about the gradient normalization

    https://github.com/ShiyuLiang/odin-pytorch/blob/34e53f5a982811a0d74baba049538d34efc0732d/code/calData.py#L61

    Hi, when I read the gradient normalization part of the code, I have some questions. What is the meaning of 63, 62.1, and 66.7, and where can I learn about these settings? Thank you~

    opened by superkevingit 1
  • 'BatchNorm2d' object has no attribute 'track_running_stats'

    Hi, I'm using PyTorch version 0.4.0 and I get this error at the following line:

    out = self.conv1(self.relu(self.bn1(x))) # line 34 of densenet.py

    The error message looks like this:

    File "/opt/odin-pytorch/code/densenet.py", line 34, in forward
        out = self.conv1(self.relu(self.bn1(x)))
    File "/usr/local/lib/python3.5/dist-packages/torch/nn/modules/module.py", line 491, in __call__
        result = self.forward(*input, **kwargs)
    File "/usr/local/lib/python3.5/dist-packages/torch/nn/modules/batchnorm.py", line 49, in forward
        self.training or not self.track_running_stats, self.momentum, self.eps)
    File "/usr/local/lib/python3.5/dist-packages/torch/nn/modules/module.py", line 532, in __getattr__
        type(self).__name__, name))

    I found that other repositories have similar issues when upgrading from PyTorch 0.3.0 to 0.4.0: https://discuss.pytorch.org/t/batchnorm2d-object-has-no-attribute-track-running-stats/17525/6

    Should I use PyTorch 0.3.x? (A compatibility workaround is sketched after this comments list.)

    opened by Jihunlee326 5
  • Pre-trained WideResNet for CIFAR-100 Missing

    Hello, thank you for providing all of this code for your experiments!

    I was trying to download the pre-trained model weights for the WideResNets, and it seems that both the CIFAR-10 link and the CIFAR-100 link lead to the same model (the CIFAR-10 version). Is it possible to update the link so that it leads to the proper model?

    Thanks!

    opened by TDeVries 1
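
Regarding the track_running_stats report above: checkpoints serialized with PyTorch 0.3.x lack an attribute that the BatchNorm forward pass expects in PyTorch 0.4.0 and later. Rather than downgrading, one hedged workaround (ours, not the repository's) is to backfill the attribute after loading the model:

import torch

def patch_batchnorm(model):
    # Checkpoints saved before PyTorch 0.4 miss track_running_stats;
    # setting it to True restores the old behavior (use running stats).
    for module in model.modules():
        if isinstance(module, torch.nn.modules.batchnorm._BatchNorm):
            if not hasattr(module, "track_running_stats"):
                module.track_running_stats = True
    return model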