From Canonical Correlation Analysis to Self-supervised Graph Neural Networks

Overview

Code for CCA-SSG model proposed in the NeurIPS 2021 paper From Canonical Correlation Analysis to Self-supervised Graph Neural Networks.
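
For reference, the objective proposed in the paper combines an invariance term between two augmented views with a decorrelation term on each view's feature dimensions. Below is a minimal PyTorch sketch of this objective (illustrative only; the variable names are ours, and the repository's main.py may organize the computation differently):

import torch

def cca_ssg_loss(z1, z2, lambd=1e-3):
    """Sketch of the CCA-SSG objective.

    z1, z2: node embeddings of the two augmented views, shape (N, D).
    lambd:  trade-off weight between the two terms (the --lambd flag).
    """
    N, D = z1.shape
    # Standardize each view so every feature dimension has zero mean and
    # std 1/sqrt(N); then z.T @ z is directly a correlation matrix.
    z1 = (z1 - z1.mean(dim=0)) / (z1.std(dim=0) * N ** 0.5)
    z2 = (z2 - z2.mean(dim=0)) / (z2.std(dim=0) * N ** 0.5)

    # Invariance term: pull the two views of each node together.
    inv = (z1 - z2).pow(2).sum()

    # Decorrelation term: push each view's feature-feature correlation
    # matrix toward the identity.
    I = torch.eye(D, device=z1.device)
    dec = (z1.T @ z1 - I).pow(2).sum() + (z2.T @ z2 - I).pow(2).sum()

    return inv + lambd * dec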

Dependencies

  • Python 3.7
  • PyTorch 1.7.1
  • dgl 0.6.0
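
The dependencies can typically be installed with pip; the command below assumes a CPU-only build of DGL (pick the wheel matching your CUDA version for GPU training):

pip install torch==1.7.1 dgl==0.6.0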

Datasets

Citation Networks: 'Cora', 'Citeseer' and 'Pubmed'.

Co-occurrence Networks: 'Amazon-Computer', 'Amazon-Photo', 'Coauthor-CS' and 'Coauthor-Physics'.

Dataset           # Nodes  # Edges  # Classes  # Features
Cora                2,708   10,556          7       1,433
Citeseer            3,327    9,228          6       3,703
Pubmed             19,717   88,651          3         500
Amazon-Computer    13,752  574,418         10         767
Amazon-Photo        7,650  287,326          8         745
Coauthor-CS        18,333  327,576         15       6,805
Coauthor-Physics   34,493  991,848          5       8,451

Usage

To run the code, use the following commands:

# Cora
python main.py --dataname cora --epochs 50 --lambd 1e-3 --dfr 0.1 --der 0.4 --lr2 1e-2 --wd2 1e-4

# Citeseer
python main.py --dataname citeseer --epochs 20 --n_layers 1 --lambd 5e-4 --dfr 0.0 --der 0.4 --lr2 1e-2 --wd2 1e-2

# Pubmed
python main.py --dataname pubmed --epochs 100 --lambd 1e-3 --dfr 0.3 --der 0.5 --lr2 1e-2 --wd2 1e-4

# Amazon-Computer
python main.py --dataname comp --epochs 50 --lambd 5e-4 --dfr 0.1 --der 0.3 --lr2 1e-2 --wd2 1e-4

# Amazon-Photo
python main.py --dataname photo --epochs 50 --lambd 1e-3 --dfr 0.2 --der 0.3 --lr2 1e-2 --wd2 1e-4

# Coauthor-CS
python main.py --dataname cs --epochs 50 --lambd 1e-3 --dfr 0.2 --lr2 5e-3 --wd2 1e-4 --use_mlp

# Coauthor-Physics
python main.py --dataname physics --epochs 100 --lambd 1e-3 --dfr 0.5 --der 0.5 --lr2 5e-3 --wd2 1e-4
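
The flags above follow the hyper-parameters in the paper: --lambd is the trade-off weight λ in the loss, --dfr and --der set the drop-feature and drop-edge ratios of the two random augmentations, --lr2 and --wd2 are the learning rate and weight decay of the downstream classifier, and --use_mlp swaps the GNN encoder for an MLP.

After self-supervised pre-training, the model is evaluated by fitting a linear (logistic-regression) classifier on the frozen embeddings. A minimal sketch of that protocol (illustrative names, not the repository's exact code):

import torch
import torch.nn as nn
import torch.nn.functional as F

def linear_eval(emb, labels, train_mask, test_mask,
                lr=1e-2, wd=1e-4, epochs=300):
    """Train a logistic-regression head on frozen embeddings (sketch).

    emb:    (N, D) node embeddings from the frozen encoder (detached).
    labels: (N,) integer class labels.
    lr, wd: correspond to the --lr2 / --wd2 flags.
    """
    clf = nn.Linear(emb.size(1), int(labels.max()) + 1)
    opt = torch.optim.Adam(clf.parameters(), lr=lr, weight_decay=wd)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(clf(emb[train_mask]), labels[train_mask])
        loss.backward()
        opt.step()
    pred = clf(emb[test_mask]).argmax(dim=1)
    return (pred == labels[test_mask]).float().mean().item()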

Reference

If our paper and code are useful for your research, please cite the following article:

@inproceedings{zhang2021canonical,
  title={From canonical correlation analysis to self-supervised graph neural networks},
  author={Zhang, Hengrui and Wu, Qitian and Yan, Junchi and Wipf, David and Yu, Philip S},
  booktitle={Thirty-Fifth Conference on Neural Information Processing Systems},
  year={2021}
}
Comments
  • Question about linear evaluation

    Generally, only the validation set is used to select the best_model during training. So I wonder: is it appropriate to use test_acc to help select eval_acc?

    opened by THINK2TRY 4
  • About the inconsistency between the loss function in the code and in the paper

    Congratulations on the great paper and on its acceptance to NeurIPS! I have a question about the loss function. The code corresponding to the self-supervised loss is on lines 75 and 88 of main.py, but its formulation is not consistent with the one given in the paper. Why is this the case?

    opened by cjissmart 4
  • About 1/sqrt(N) std

    I'm very impressed by your work. Thanks.

    Is there any reason or insight behind normalizing node representations to have std 1/sqrt(N) rather than 1?

    Also, in my own work the results were fine when I normalized with the std you use, but I'm wondering about the reason. If you have any insights or references, feel free to share them.

    Even a short answer would be greatly appreciated.

    opened by SeongJinAhn 2
  • A question about the correlation matrix

    Hi Hengrui,

    May I ask one question about your paper? In Section F.1 you visualize the correlation matrix, but I didn't find a formal definition for it. Is it the same as the covariance matrix mentioned in Section 3.2?

    Best regards, Simon

    opened by brave-simon 2
  • The way of evaluation

    Hi, author! Thanks for your code! I have a question about the evaluation, though. In lines 147-150 of main.py, you first check whether the new val_acc is larger than the previous best, and only then update test_acc in the same way. However, in my opinion the test set shouldn't be used at all during the training of the logistic-regression classifier; only the validation set should be. Is there a problem here?

    opened by liun-online 2