Non-stationary GP package written from scratch in PyTorch

Overview

NSGP-Torch

Examples

gpytorch model with skgpytorch

# Import packages
import torch
from regdata import NonStat2D
from gpytorch.kernels import RBFKernel, ScaleKernel
from skgpytorch.models import ExactGPRegressor
from skgpytorch.metrics import mean_squared_error, neg_log_posterior_density

# Hyperparameters
n_iters = 100
seed = 0  # random seed passed to fit() for reproducibility

# Load data
datafunc = NonStat2D(backend="torch")
X_train, y_train, X_test = map(lambda x: x.to(torch.float32), datafunc.get_data())
y_test = datafunc.f(X_test[:, 0], X_test[:, 1]).to(torch.float32)

# Define a kernel
kernel = ScaleKernel(RBFKernel(ard_num_dims=X_train.shape[1]))

# Define a model 
model = ExactGPRegressor(X_train, y_train, kernel, device='cpu')

# Train the model
model.fit(n_iters=n_iters, random_state=seed)

# Predict the distribution
pred_dist = model.predict(X_train, y_train, X_test)

# Compute RMSE and/or NLPD
rmse = mean_squared_error(pred_dist, y_test, squared=False)  # squared=False returns RMSE
nlpd = neg_log_posterior_density(pred_dist, y_test)

nsgptorch model with skgpytorch

# Import packages
import torch
from regdata import NonStat2D

from nsgptorch.kernels import rbf

from skgpytorch.models import ExactNSGPRegressor
from skgpytorch.metrics import mean_squared_error, neg_log_posterior_density

# Hyperparameters
n_iters = 100
seed = 0  # random seed passed to fit() for reproducibility

# Load data
datafunc = NonStat2D(backend="torch")
X_train, y_train, X_test = map(lambda x: x.to(torch.float32), datafunc.get_data())
y_test = datafunc.f(X_test[:, 0], X_test[:, 1]).to(torch.float32)

# Define a kernel list for each dimension
kernel_list = [rbf, rbf]

# Define inducing points for each dimension (must be None if not applicable)
inducing_points = [None, None]

# Define a model 
model = ExactNSGPRegressor(kernel_list, input_dim=2, inducing_points=inducing_points, device='cpu')

# Train the model
model.fit(X_train, y_train, n_iters=n_iters, random_state=seed)

# Predict the distribution
pred_dist = model.predict(X_train, y_train, X_test)

# Compute RMSE and/or NLPD
rmse = mean_squared_error(pred_dist, y_test, squared=False)  # squared=False returns RMSE
nlpd = neg_log_posterior_density(pred_dist, y_test)

Plan

  • Each kernel is 1D
  • Multiply kernels to each other (see the sketch below)
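The product construction could look roughly like the following plain-PyTorch sketch; the rbf_1d and product_kernel helpers are illustrative names, not part of the nsgptorch API.

import torch

def rbf_1d(x1, x2, lengthscale):
    # Squared-exponential kernel on a single input dimension
    # x1: (n, 1), x2: (m, 1) -> covariance of shape (n, m)
    sqdist = (x1 - x2.T) ** 2
    return torch.exp(-0.5 * sqdist / lengthscale ** 2)

def product_kernel(X1, X2, lengthscales):
    # Full covariance built by multiplying one 1D kernel per input dimension
    K = torch.ones(X1.shape[0], X2.shape[0], dtype=X1.dtype)
    for d in range(X1.shape[1]):
        K = K * rbf_1d(X1[:, d:d+1], X2[:, d:d+1], lengthscales[d])
    return K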

Ideas

  • Compute distance once and save it (sketched below)
  • Update skgpytorch to use 1 std instead of 0.1
  • Do something about mean learning of gpytorch for comparison
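For the first idea, the caching could be sketched as follows; CachedSqDist is a hypothetical helper for illustration, not existing nsgptorch code.

import torch

class CachedSqDist:
    # Pre-compute pairwise squared distances per input dimension once, so
    # repeated kernel evaluations during training only redo the cheap
    # lengthscale-dependent part.
    def __init__(self, X):
        # X: (n, d) -> list of d matrices of shape (n, n)
        self.sqdists = [
            (X[:, d:d+1] - X[:, d:d+1].T) ** 2 for d in range(X.shape[1])
        ]

    def rbf(self, dim, lengthscale):
        # Reuse cached distances; only the lengthscale scaling changes between iterations
        return torch.exp(-0.5 * self.sqdists[dim] / lengthscale ** 2)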