Official PyTorch implementation of "Unbiased Classification Through Bias-Contrastive and Bias-Balanced Learning" (NeurIPS 2021)

Overview

Official PyTorch implementation of "Unbiased Classification Through Bias-Contrastive and Bias-Balanced Learning" (NeurIPS 2021).

Setup

The provided environment requires CUDA 11. Alternatively, you can use your own environment by installing the requirements, including PyTorch and torchvision.

  1. Create the conda environment and activate it:
conda env create -f environment.yml
conda activate biascon
  2. Prepare the datasets.
  • Biased MNIST
    By default, we set download=True for convenience, so you only need to create the empty dataset directory with mkdir -p data/biased_mnist and run the code (see the sketch after this list for an illustration of how the color bias is injected).

  • CelebA
    Download CelebA dataset under data/celeba

  • UTKFace
    Download UTKFace dataset under data/utk_face

  • ImageNet & ImageNet-A
    We use the ILSVRC 2015 ImageNet dataset.
    Download ImageNet under ./data/imagenet and ImageNet-A under ./data/imagenet-a
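
For intuition, Biased MNIST correlates each digit class with a preset color: with probability corr a sample receives its class's color, and a random other color otherwise. A minimal sketch of this construction, assuming a hypothetical 10-color palette (the repository's actual palette and coloring scheme may differ):

```python
import random

import torch

# Illustrative 10-color palette, one preset color per digit class
# (the palette used by the repository may differ).
PALETTE = torch.tensor([
    [255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 0], [255, 0, 255],
    [0, 255, 255], [255, 128, 0], [128, 0, 255], [0, 128, 128], [128, 128, 0],
], dtype=torch.float32) / 255.0

def colorize(img: torch.Tensor, label: int, corr: float) -> torch.Tensor:
    """Turn a (1, H, W) grayscale digit into a (3, H, W) color-biased image.

    With probability `corr` the digit receives its class's preset color
    (bias-aligned); otherwise a random other color (bias-conflicting).
    Note: the original Biased MNIST (Bahng et al., 2020) injects the color
    into the background; this sketch tints the digit itself for brevity.
    """
    if random.random() < corr:
        color_idx = label
    else:
        color_idx = random.choice([c for c in range(10) if c != label])
    return img * PALETTE[color_idx].view(3, 1, 1)
```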

Biased MNIST (w/ bias labels)

We train with bias correlations {0.999, 0.997, 0.995, 0.99, 0.95, 0.9}, set via --corr.

Bias-contrastive loss (BiasCon)

python train_biased_mnist_bc.py --corr 0.999 --seed 1
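
Conceptually, BiasCon is a supervised contrastive loss whose positives share the target label but carry a different bias label, pulling same-class samples together across bias groups. Below is a minimal sketch of that idea, not the repository's exact implementation:

```python
import torch
import torch.nn.functional as F

def bias_contrastive_loss(feats, labels, bias_labels, temperature=0.07):
    """Supervised contrastive loss whose positives share the target label
    but have a *different* bias label (a sketch of the BiasCon idea).

    feats:       (N, D) embeddings
    labels:      (N,)   target labels
    bias_labels: (N,)   bias labels
    """
    feats = F.normalize(feats, dim=1)
    sim = feats @ feats.t() / temperature
    self_mask = torch.eye(len(feats), dtype=torch.bool, device=feats.device)
    sim = sim.masked_fill(self_mask, float('-inf'))   # drop self-pairs from the denominator

    same_class = labels.unsqueeze(0) == labels.unsqueeze(1)
    diff_bias = bias_labels.unsqueeze(0) != bias_labels.unsqueeze(1)
    pos_mask = same_class & diff_bias                 # bias-conflicting positives only

    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)
    loss = -pos_log_prob.sum(1) / pos_mask.sum(1).clamp(min=1)
    return loss[pos_mask.any(1)].mean()               # average over anchors with >= 1 positive
```

Anchors without any bias-conflicting positive in the batch are skipped; at high correlations such as --corr 0.999, such positives are rare, which is exactly what makes them informative.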

Bias-balancing loss (BiasBal)

python train_biased_mnist_bb.py --corr 0.999 --seed 1
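
BiasBal, in spirit, balances the classifier against the skewed class-bias co-occurrence of the training set via a logit adjustment. A hedged sketch, assuming the adjustment uses the empirical prior p(y | bias); the repository's exact formulation may differ:

```python
import torch
import torch.nn.functional as F

def bias_balanced_loss(logits, labels, bias_labels, prior):
    """Cross entropy with a per-bias logit adjustment (BiasBal sketch).

    logits:      (N, C) classifier outputs
    labels:      (N,)   target labels
    bias_labels: (N,)   bias labels
    prior:       (B, C) empirical p(y | bias) estimated from training counts
    """
    # Shifting logits by log p(y | b) down-weights predictions that are
    # explainable by the bias-class co-occurrence alone.
    adjusted = logits + torch.log(prior[bias_labels] + 1e-12)
    return F.cross_entropy(adjusted, labels)
```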

Joint use of BiasCon and BiasBal losses (BC+BB)

python train_biased_mnist_bc.py --bb 1 --corr 0.999 --seed 1
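
Schematically, --bb 1 optimizes both objectives at once. Reusing the two sketches above, the joint loss would be a weighted sum (the weight is a hypothetical hyperparameter):

```python
lambda_bc = 1.0  # hypothetical trade-off weight between the two terms
loss = bias_balanced_loss(logits, labels, bias_labels, prior) \
       + lambda_bc * bias_contrastive_loss(feats, labels, bias_labels)
```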

CelebA

We evaluate on the CelebA dataset with the target attributes HeavyMakeup (--task makeup) and Blonde (--task blonde).
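
torchvision's CelebA wrapper can be used to inspect the attributes involved; HeavyMakeup is stored as Heavy_Makeup, Blonde as Blond_Hair, and Male is commonly used as the bias attribute in this setting. A sketch (the repository may load the raw annotation files directly):

```python
from torchvision.datasets import CelebA

# Assumes the dataset has been placed (or downloaded) under data/celeba.
dataset = CelebA(root='data', split='train', target_type='attr', download=False)

# Indices of the target and (assumed) bias attributes in the 40-dim attribute vector.
target_idx = dataset.attr_names.index('Heavy_Makeup')
bias_idx = dataset.attr_names.index('Male')

img, attrs = dataset[0]
print('target:', attrs[target_idx].item(), 'bias:', attrs[bias_idx].item())
```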

Bias-contrastive loss (BiasCon)

python train_celeba_bc.py --task makeup --seed 1

Bias-balancing loss (BiasBal)

python train_celeba_bb.py --task makeup --seed 1

Joint use of BiasCon and BiasBal losses (BC+BB)

python train_celeba_bc.py --bb 1 --task makeup --seed 1

UTKFace

We evaluate on the UTKFace dataset, which is biased toward the Race (--task race) and Age (--task age) attributes.
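
UTKFace encodes its labels directly in the file names as [age]_[gender]_[race]_[date&time].jpg, so a loader can recover the attributes from the path. A minimal sketch (a handful of files in the original release are known to have malformed names, so production code should guard against parse failures):

```python
from pathlib import Path

def parse_utkface_name(path: str):
    """Parse age, gender, and race from a UTKFace file name.

    File names follow the pattern [age]_[gender]_[race]_[date&time].jpg.
    """
    age, gender, race = Path(path).name.split('_')[:3]
    return int(age), int(gender), int(race)

# Example: inspect the first few files in the dataset directory.
for p in sorted(Path('data/utk_face').glob('*.jpg'))[:5]:
    print(p.name, parse_utkface_name(str(p)))
```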

Bias-contrastive loss (BiasCon)

python train_utk_face_bc.py --task race --seed 1

Bias-balancing loss (BiasBal)

python train_utk_face_bb.py --task race --seed 1

Joint use of BiasCon and BiasBal losses (BC+BB)

python train_utk_face_bc.py --bb 1 --task race --seed 1

Biased MNIST (w/o bias labels)

We train with bias correlations {0.999, 0.997, 0.995, 0.99, 0.95, 0.9}, set via --corr.

Soft Bias-contrastive loss (SoftCon)

  1. Train a bias-capturing model and extract its bias features:
python get_biased_mnist_bias_features.py --corr 0.999 --seed 1
  2. Train the debiased model with the bias features:
python train_biased_mnist_softcon.py --corr 0.999 --seed 1
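
SoftCon removes the need for bias labels by softening the positive set: same-class pairs whose bias features are dissimilar are weighted as stronger positives. A hedged sketch of such a weighting (illustrative, not the repository's exact formulation):

```python
import torch
import torch.nn.functional as F

def soft_positive_weights(bias_feats, labels):
    """Soft positive weights from bias-feature similarity (SoftCon sketch).

    bias_feats: (N, D) features from the bias-capturing model
    labels:     (N,)   target labels

    Same-class pairs with dissimilar bias features receive weights near 1,
    approximating bias-conflicting positives without bias labels.
    """
    bias_feats = F.normalize(bias_feats, dim=1)
    bias_sim = bias_feats @ bias_feats.t()          # cosine similarity in [-1, 1]
    same_class = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    weights = same_class * (1.0 - bias_sim) / 2.0   # map similarity to a [0, 1] weight
    weights.fill_diagonal_(0.0)                     # exclude self-pairs
    return weights
```

These weights would then replace the binary positive mask in the BiasCon sketch above.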

ImageNet

We use texture cluster information from ReBias (Bahng et al., 2020).

Soft Bias-contrastive loss (SoftCon)

  1. Train a bias-capturing model and extract its bias features:
python get_imagenet_bias_features.py --seed 1
  2. Train the debiased model with the bias features:
python train_imagenet_softcon.py --seed 1
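
For ImageNet the bias-capturing features are texture-oriented (ReBias uses BagNet-style models), and cluster assignments over those features serve as pseudo bias groups. A hedged sketch of such a clustering step; the file name and cluster count here are illustrative, not the repository's:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical output of get_imagenet_bias_features.py: an (N, D) feature array.
feats = np.load('bias_feats_imagenet.npy')

# Cluster texture-oriented features into pseudo bias groups (ReBias-style).
kmeans = KMeans(n_clusters=9, n_init=10, random_state=0).fit(feats)
pseudo_bias = kmeans.labels_  # one cluster index per sample
```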
Comments
  • Clarifications on augmentation

    Hi and thanks for your very interesting work!

    I have a couple of questions regarding the transformations used for the contrastive loss.

    • In your paper you mention that we first construct a 2N-size "multiviewed batch" by applying a pair of randomized augmentations (e.g., random crop, random flip) on N data samples in a single batch. Looking at Eqs. 1, 2, and 3, I understand that the same starting data (a batch of N) is used for the cross entropy and then augmented to 2N for the contrastive loss. However, I do not understand why you use a different dataloader for the contrastive loss: https://github.com/grayhong/bias-contrastive-learning/blob/35562b6d700356c3a3c7346781f0310ca38f6fd1/train_biased_mnist_bc.py#L157 Wouldn't that effectively load more data than the original N samples?

    • Why do you apply hue jitter on Biased-MNIST? https://github.com/grayhong/bias-contrastive-learning/blob/35562b6d700356c3a3c7346781f0310ca38f6fd1/debias/datasets/biased_mnist.py#L233

    Thanks in advance!

    opened by carloalbertobarbano