Node-level Graph Regression with Deep Gaussian Process Models

Overview

This repository provides the implementation and experiments for node-level graph regression with deep Gaussian process models (DGPG).

Prerequisites

Our implementation is based mainly on TensorFlow 1.x and GPflow 1.x:

Python 3.x (3.7 tested)
conda install tensorflow-gpu==1.15
pip install keras==2.3.1
pip install gpflow==1.5
pip install gpuinfo

In addition, basic packages such as NumPy are required. Porting the code to TensorFlow 2.0 and GPflow 2 should be straightforward, but this has not been tested.
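
A quick way to verify the environment is to print the installed versions and probe GPU visibility (a minimal sketch, assuming the packages above installed cleanly):

import tensorflow as tf
import gpflow
import keras

print("tensorflow:", tf.__version__)                # expect 1.15.x
print("gpflow:", gpflow.__version__)                # expect 1.5.x
print("keras:", keras.__version__)                  # expect 2.3.1
print("GPU available:", tf.test.is_gpu_available())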

Specification

Both the source code and the experiment results are provided. Unzip the two archive files before using the experiment notebooks.
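
The extraction can also be scripted; a minimal Python sketch using the two archive names from the file list below:

import zipfile

# Unpack the experiment notebooks and the archived results in place.
for archive in ("experiments.zip", "results.zip"):
    with zipfile.ZipFile(archive) as zf:
        zf.extractall()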

Files

  • dgp_graph/: core code of the DGPG model (a plain-GPflow usage sketch follows this file list).
    • impl_parallel.py: a fast implementation with node-level parallelized computation, invoked by all experiments.
    • my_op.py: custom TensorFlow operations used in the implementation.
    • impl.py: a basic loop-based implementation; easy to understand but impractically slow, kept only for calibration.
  • data/: datasets.
  • doubly_stochastic_dgp/: code from the Doubly Stochastic DGP repository.
  • compatible/: code that makes the DGP source compatible with GPflow 1.5.
  • gpflow_monitor/: monitoring tool for GPflow models, taken from an external repository.
  • GRN inference: code and data for the GRN inference experiment.
  • demo_city45.ipynb: Jupyter notebook for the city45 dataset experiment.
  • experiments.zip: Jupyter notebooks for the other experiments.
  • results.zip: original Jupyter notebook results, exported as HTML files for archiving.
  • run_toy.sh: shell script to run the additional experiment.
  • toy_main.py: code for the additional experiment (traditional ML methods and DGPG with a linear kernel).
  • ER-0.1.ipynb: example script for analyzing time-varying graph structures.
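
The DGPG classes in dgp_graph/impl_parallel.py build directly on GPflow 1.x; their exact interface is not restated here, but for orientation, a minimal single-layer GP regression in plain GPflow 1.5 (not the repository's DGPG API) looks like this:

import numpy as np
import gpflow

# Toy 1-D regression data.
X = np.random.rand(20, 1)
Y = np.sin(6 * X) + 0.1 * np.random.randn(20, 1)

# GPflow 1.x kernels take input_dim explicitly.
kern = gpflow.kernels.RBF(input_dim=1)
model = gpflow.models.GPR(X, Y, kern=kern)

# Fit hyperparameters by maximizing the marginal likelihood.
gpflow.train.ScipyOptimizer().minimize(model)
mean, var = model.predict_y(X)   # predictive mean and variance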

Experiments

The experiments are implemented in Python source files and demonstrated in Jupyter notebooks. The notebooks for each experiment are packaged in experiments.zip, and the corresponding results are exported as static HTML files stored in results.zip. They are organized by dataset name:

  1. Synthetic Datasets

For theoretical analysis:

  • demo_toy_run1.ipynb
  • demo_toy_run2.ipynb
  • demo_toy_run3.ipynb
  • demo_toy_run4.ipynb
  • demo_toy_run5.ipynb

For graph signal analysis on time-varying graphs:

  • ER-0.05.ipynb
  • ER-0.2.ipynb
  • RWP-0.1.ipynb
  • RWP-0.2.ipynb
  • RWP-0.3.ipynb

  2. Small Datasets
  • demo_city45.ipynb
  • demo_city45_linear.ipynb (linear kernel)
  • demo_city45_baseline.ipynb (traditional regression methods)
  • demo_etex.ipynb
  • demo_etex_linear.ipynb
  • demo_etex_baseline.ipynb
  • demo_fmri.ipynb
  • demo_fmri_linear.ipynb
  • demo_fmri_baseline.ipynb
  3. Large Datasets (traffic flow prediction)
  • LA
    • demo_la_15min.ipynb
    • demo_la_30min.ipynb
    • demo_la_60min.ipynb
  • BAY
    • demo_bay_15min.ipynb
    • demo_bay_30min.ipynb
    • demo_bay_60min.ipynb