
Kernelized-HRM

Jiashuo Liu, Zheyuan Hu

The code for our NeurIPS 2021 paper "Kernelized Heterogeneous Risk Minimization" [1]. This repo contains the code for our simulated experiments on Classification with Spurious Correlation and Regression with Selection Bias, including the data generation process, the full Kernelized-HRM algorithm, and the testing process.

Details

There are two files, KernelHRM_sim1.py and KernelHRM_sim2.py, which contain the code for the classification simulation experiment and the regression simulation experiment, respectively. The main functions are:

  • generate_data_list: generates data according to the given parameter args.r_list (an illustrative sketch of this kind of generation follows this list).

  • generate_test_data_list: generates the test data for the Selection Bias experiment, where args.r_list is pre-defined as [-2.9, -2.7, ..., -1.9].

  • main_KernelHRM: implements the whole framework of our Kernelized-HRM algorithm.
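
For intuition, below is a minimal, hypothetical sketch of the kind of selection-bias data generation that generate_data_list performs for a list of r values. The construction and all names other than generate_data_list and r_list are illustrative assumptions, not the repository's actual implementation.

```python
# Illustrative sketch of selection-bias data generation; the actual
# generate_data_list in KernelHRM_sim2.py may differ.
import numpy as np

def generate_env(n, r, dim_stable=5, dim_spurious=5, seed=0):
    """Generate one environment whose spurious correlation strength is |r|.

    The sign of r controls whether the spurious features are positively or
    negatively correlated with the label (hypothetical construction).
    """
    rng = np.random.RandomState(seed)
    S = rng.randn(n, dim_stable)                 # stable (invariant) features
    theta = np.ones(dim_stable)
    Y = S @ theta + 0.3 * rng.randn(n)           # label depends only on S
    # Spurious features: correlated with Y, strength/sign controlled by r.
    noise_scale = 1.0 / max(abs(r), 1e-3)
    V = np.sign(r) * Y[:, None] + noise_scale * rng.randn(n, dim_spurious)
    X = np.concatenate([S, V], axis=1)
    return X, Y

def generate_data_list(r_list, n_per_env=1000):
    """Return one (X, Y) pair per entry of r_list, as the README describes."""
    return [generate_env(n_per_env, r, seed=i) for i, r in enumerate(r_list)]
```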

Hyper-parameters

The framework has many hyper-parameters, which differ across tasks and need to be tuned carefully. Note that although we provide the hyper-parameters for the simulated experiments, the results may not exactly match ours, which may be due to randomness or other factors.

Generally, the following hyper-parameters need to be tuned carefully:

  • k: controls the dimension of reduced neural tangent features
  • whole_epoch: controls the overall number of iterations between the frontend and the backend
  • epochs: controls the number of epochs of optimizing the invariant learning module in each iteration
  • IRM_lam: controls the strength of the regularizer for the invariant learning
  • lr: learning rate
  • cluster_num: controls the number of clusters

Further, for the experimental settings, the following parameters need to be specified:

  • r_list: controls the strength of spurious correlations
  • scramble: whether to mix (scramble) the raw features, similar to IRM [2]
  • num_list: controls the number of data points from each environment

The optimal hyper-parameters for our simulation experiments are provided in the reproduce.sh file.
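
As a reference, the following is a hypothetical argparse block collecting the parameters listed above; the flag names follow the README, but the default values are placeholders and not the tuned values from reproduce.sh.

```python
# Hypothetical argument definitions for the hyper-parameters described above;
# defaults are placeholders, see reproduce.sh for the tuned values.
import argparse

parser = argparse.ArgumentParser(description="Kernelized-HRM (sketch)")
parser.add_argument("--k", type=int, default=50,
                    help="dimension of the reduced neural tangent features")
parser.add_argument("--whole_epoch", type=int, default=10,
                    help="outer iterations between the frontend and the backend")
parser.add_argument("--epochs", type=int, default=100,
                    help="epochs for the invariant learning module per iteration")
parser.add_argument("--IRM_lam", type=float, default=1.0,
                    help="strength of the invariance regularizer")
parser.add_argument("--lr", type=float, default=1e-3, help="learning rate")
parser.add_argument("--cluster_num", type=int, default=2,
                    help="number of clusters in the frontend")
parser.add_argument("--r_list", type=float, nargs="+", default=[1.9, -1.1],
                    help="strength of spurious correlations per environment")
parser.add_argument("--scramble", type=int, default=0,
                    help="whether to mix the raw features (as in IRM)")
parser.add_argument("--num_list", type=int, nargs="+", default=[1000, 1000],
                    help="number of data points from each environment")
args = parser.parse_args()
```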

Others

Similar to HRM [3], we view the proposed Kernelized-HRM as a framework: it converts non-linear, complicated data into linear, raw-feature data via the neural tangent kernel, and combines a clustering module with an invariant prediction module. In practice, each module can be replaced with any model of the user's choice to the same effect.
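
The alternation can be summarized by the following sketch. This is illustrative Python, not the repository's main_KernelHRM: the NTK-feature, clustering, and invariant-learning components are stand-ins that can be swapped freely.

```python
# High-level sketch of the Kernelized-HRM alternation (illustrative only).
def kernel_hrm(X, Y, whole_epoch, cluster_num, ntk_features, cluster_fn, irm_fit):
    """Alternate between heterogeneity identification and invariant learning.

    ntk_features: maps raw inputs to (reduced) neural tangent kernel features
    cluster_fn:   frontend, splits the data into `cluster_num` environments
    irm_fit:      backend, learns an invariant predictor across environments
    """
    Phi = ntk_features(X)          # linearize the data once via the NTK
    env_labels = None
    model = None
    for _ in range(whole_epoch):
        # Frontend: cluster the heterogeneity into learned environments.
        env_labels = cluster_fn(Phi, Y, model, cluster_num)
        # Backend: invariant learning (e.g. an IRM-style penalty) over
        # the learned environments.
        model = irm_fit(Phi, Y, env_labels)
    return model, env_labels
```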

Though we hate to mention it, our method has the following shortcomings:

  • Just like the original HRM [3], the convergence of the frontend module cannot be guaranteed, and we notice that in some cases the next iteration does not improve the current results, or even hurts them.
  • Hyper-parameters for different tasks may be quite different and need to be tuned carefully.
  • Whether this algorithm can be extended to more complicated image data, such as PACS and NICO, remains to be seen. (Maybe we will give it a try later?)

Reference

[1] Jiashuo Liu, Zheyuan Hu, Peng Cui, Bo Li, Zheyan Shen. Kernelized Heterogeneous Risk Minimization. In NeurIPS 2021.

[2] Martin Arjovsky, Léon Bottou, Ishaan Gulrajani, David Lopez-Paz. Invariant Risk Minimization. arXiv preprint arXiv:1907.02893, 2019.

[3] Jiashuo Liu, Zheyuan Hu, Peng Cui, Bo Li, Zheyan Shen. Heterogeneous Risk Minimization. In ICML 2021.

