
Code Release for Learning to Adapt to Evolving Domains

EAML

Code release for "Learning to Adapt to Evolving Domains" (NeurIPS 2020)

Prerequisites

  • PyTorch >= 0.4.0 (with a compatible CUDA and cuDNN version)
  • torchvision >= 0.2.1
  • Python 3
  • NumPy
  • argparse
  • Pillow (PIL)
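
The snippet below is an optional sanity check for the prerequisites above; it is purely illustrative and not part of the repository (the helper version_tuple is ours).

# Optional environment check (illustrative only, not part of the repository).
import sys
import numpy
import torch
import torchvision
from PIL import Image

def version_tuple(v):
    # e.g. "1.13.1+cu117" -> (1, 13, 1); a rough parse for release versions
    return tuple(int(p) for p in v.split("+")[0].split(".")[:3])

assert sys.version_info.major == 3, "Python 3 is required"
assert version_tuple(torch.__version__) >= (0, 4, 0), "PyTorch >= 0.4.0 is required"
assert version_tuple(torchvision.__version__) >= (0, 2, 1), "torchvision >= 0.2.1 is required"
print("CUDA available:", torch.cuda.is_available())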

Dataset

Rotated MNIST: https://drive.google.com/file/d/1eaw42sg4Cgm34790AW_SKGCSkFosugl2/view?usp=sharing
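
Purely as an illustration of the data, and assuming each domain is MNIST rotated by a fixed angle, rotated batches could also be generated on the fly with torchvision as sketched below. This is not the repository's own preprocessing (use the released archive above for the experiments), and rotated_mnist_loader is a hypothetical helper.

# Illustrative sketch only: build rotated-MNIST batches with torchvision.
import torch
from torchvision import datasets, transforms
from torchvision.transforms import functional as TF

def rotated_mnist_loader(root, angle, batch_size=128, train=True):
    # Every 28x28 digit is rotated by a fixed `angle` (in degrees), one angle per domain.
    transform = transforms.Compose([
        transforms.Lambda(lambda img: TF.rotate(img, angle)),
        transforms.ToTensor(),
    ])
    dataset = datasets.MNIST(root, train=train, download=True, transform=transform)
    return torch.utils.data.DataLoader(dataset, batch_size=batch_size, shuffle=train)

# Example: a sequence of loaders mimicking an evolving target domain.
loaders = [rotated_mnist_loader("./data", angle) for angle in range(0, 180, 15)]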

Training

EAML 

%run eaml.py rot_mnist_28/ --lip-balance 0.2 --lip-jth 0.01 --epochs 500 --lr-in 0.03 --lr-out 0.003 

JAN 

%run JAN.py rot_mnist_28/ --lip-balance 0.2 --lip-jth 0.01 --epochs 500 --lr-in 0.03 --lr-out 0.003

Source 

%run source.py rot_mnist_28/ --lip-balance 0.2 --lip-jth 0.01 --epochs 500 --lr-out 0.003
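
The commands above use IPython's %run magic; from a plain shell the scripts would be launched with python instead (e.g. python eaml.py rot_mnist_28/ ...). Purely as a hypothetical guide to the flags, an argparse setup consistent with these invocations might look like the sketch below; the actual scripts may use different defaults and additional options, and the help strings reflect our reading of the flag names (e.g. --lr-in/--lr-out as inner-/outer-loop learning rates in the meta-learning setup).

# Hypothetical sketch of a command-line interface matching the invocations above;
# the real eaml.py / JAN.py / source.py may differ in defaults and options.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="Training on Rotated MNIST")
    parser.add_argument("data_root", help="path to the dataset, e.g. rot_mnist_28/")
    parser.add_argument("--lip-balance", type=float, default=0.2,
                        help="trade-off weight for the regularization term (our reading)")
    parser.add_argument("--lip-jth", type=float, default=0.01,
                        help="threshold hyperparameter for the regularization term (our reading)")
    parser.add_argument("--epochs", type=int, default=500, help="number of training epochs")
    parser.add_argument("--lr-in", type=float, default=0.03,
                        help="inner-loop (adaptation) learning rate, presumably")
    parser.add_argument("--lr-out", type=float, default=0.003,
                        help="outer-loop (meta-update) learning rate, presumably")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(vars(args))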

Acknowledgement

This code is built on top of the JAN (Joint Adaptation Networks) codebase, and we gratefully acknowledge its authors' contributions. The meta-learning code is adapted from https://github.com/dragen1860/MAML-Pytorch/.

Citation

If you use this code for your research, please consider citing:

@inproceedings{NEURIPS2020_fd69dbe2,
 author = {Liu, Hong and Long, Mingsheng and Wang, Jianmin and Wang, Yu},
 booktitle = {Advances in Neural Information Processing Systems},
 editor = {H. Larochelle and M. Ranzato and R. Hadsell and M. F. Balcan and H. Lin},
 pages = {22338--22348},
 publisher = {Curran Associates, Inc.},
 title = {Learning to Adapt to Evolving Domains},
 url = {https://proceedings.neurips.cc/paper/2020/file/fd69dbe29f156a7ef876a40a94f65599-Paper.pdf},
 volume = {33},
 year = {2020}
}


Contact

If you have any problems with our code, feel free to contact us.
