Bayesian inference for Permuton-induced Chinese Restaurant Process (NeurIPS2021).

Overview

Permuton-induced Chinese Restaurant Process

(GIF animation: MCMC evolution on the Epinions data.)

Note: Currently only the MATLAB version is available, but a Python version will be available soon!

This is demo code for Bayesian nonparametric relational data analysis based on the Permuton-induced Chinese Restaurant Process (NeurIPS 2021). The key features are as follows:

  • Clustering based on rectangular partitioning: Given an input matrix, the algorithm probabilistically searches for a row order, a column order, and a rectangular partition such that similar elements are grouped within each block as much as possible (a toy illustration follows this list).
  • Infinite model complexity: There is no need to fix a suitable number of rectangular clusters in advance, in keeping with a fundamental principle of Bayesian nonparametric machine learning.
  • Arbitrary rectangular partitioning: The model can, in principle, place a posterior distribution over arbitrary rectangular partitions with any number of blocks.
  • Empirically faster mixing of Markov chain Monte Carlo (MCMC) iterations: The method most closely related to this algorithm is the Baxter Permutation Process (NeurIPS 2020); empirically, our algorithm tends to mix faster.
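
As a toy illustration of the first feature above (this snippet is not part of the released code), the following MATLAB lines build a small block-structured relational matrix of natural numbers and then shuffle its rows and columns; the MCMC procedure described below searches for row/column orders and a rectangular partition that recover such hidden block structure:

    % Toy illustration only: a 2x2 block-structured relational matrix
    % whose elements are natural numbers, with rows and columns shuffled.
    rng(0);                                   % for reproducibility
    B = [1 2; 2 1];                           % value of each block
    rowBlock = [ones(1,10), 2*ones(1,10)];    % block label of each row
    colBlock = [ones(1,15), 2*ones(1,15)];    % block label of each column
    X = B(rowBlock, colBlock);                % 20-by-30 matrix of natural numbers
    Xshuffled = X(randperm(size(X,1)), randperm(size(X,2)));  % hide the structure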

You will need a basic MATLAB installation with the Statistics and Machine Learning Toolbox.

In a nutshell

  1. cd permuton-induced-crp
  2. run

The MCMC evolution will then be displayed, as in the GIF animation at the top of this page. The following two panels are particularly noteworthy:

  • Top center: Probabilistic rectangular partitioning of a sample matrix (irmdata\sampledata.mat).
  • Bottom right: Posterior probability.

Interpretation of analysis results

(Figure: the generative model of the Permuton-induced CRP, taken from the original paper.)

The visualization drawn during the MCMC iterations requires some additional explanation of our model; please refer to the paper for full details. Our model, an extension of the Chinese Restaurant Process (CRP), is the generative probabilistic model shown in the figure above (taken from the original paper). While the standard CRP achieves sequence clustering through the analogy of seating customers (data) at tables (clusters), our model additionally achieves array clustering by assigning each table random coordinates on [0,1]x[0,1] drawn from the permuton. Viewing the table coordinates as a geometric representation of a permutation, we can apply a permutation-to-rectangulation transformation to obtain a rectangular partition of the matrix.

  • Bottom center: Random coordinates of the CRP tables on [0,1]x[0,1]. The size of each table (circle) reflects the number of customers sitting at that table.
  • Top left: Diagonal rectangulation corresponding to the permutation represented by the table coordinates.
  • Bottom left: Generic rectangulation corresponding to the permutation represented by the table coordinates.
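
For intuition only, here is a minimal MATLAB sketch of the CRP seating analogy with random table coordinates. It draws the coordinates uniformly on [0,1]x[0,1] as a stand-in for the permuton, so it is a simplification for illustration, not the released inference code:

    % Minimal sketch: CRP seating with random table coordinates.
    % Coordinates are drawn uniformly on [0,1]x[0,1] here for simplicity;
    % in the actual model they are drawn from the permuton.
    rng(1);
    alpha  = 1.0;          % CRP concentration parameter
    N      = 50;           % number of customers (data points)
    tables = [];           % tables(k) = number of customers at table k
    coords = zeros(0, 2);  % coords(k,:) = coordinates of table k
    for n = 1:N
        p = [tables, alpha];                            % unnormalized seating probabilities
        k = find(rand * sum(p) < cumsum(p), 1, 'first');
        if k > numel(tables)                            % open a new table
            tables(k) = 1;
            coords(k, :) = rand(1, 2);                  % uniform stand-in for the permuton
        else
            tables(k) = tables(k) + 1;
        end
    end
    scatter(coords(:,1), coords(:,2), 20 * tables(:), 'filled');  % circle size ~ table size
    xlabel('x'); ylabel('y'); title('CRP tables with random coordinates');

In the demo's visualization, the bottom-center panel is exactly this kind of scatter plot of tables, while the permutation encoded by the coordinates determines the rectangulations shown in the left panels.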

Details of usage

Given an input relational matrix, the Permuton-induced Chinese Restaurant Process can be fitted to it with an MCMC inference algorithm as follows:

[RowTable, ColumnTable, TableCoordinates, nesw] = test_MCMC_PCRP(X);

or

[RowTable, ColumnTable, TableCoordinates, nesw] = test_MCMC_PCRP(X, opt);

  • X: An M-by-N input observation matrix. Each element must be a natural number.
  • opt.maxiter: Maximum number of MCMC iterations.
  • opt.missingRatio: Fraction of entries held out as a test set, i.e., test/(training+test), used to evaluate predictive performance via perplexity.
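
For example, a typical call might look like the following sketch; the field name X inside sampledata.mat and the particular option values are assumptions, so adjust them to your data:

    % Usage sketch (the field name 'X' inside sampledata.mat is an assumption;
    % adjust it to the actual contents of the file).
    S = load(fullfile('irmdata', 'sampledata.mat'));
    X = S.X;

    opt.maxiter      = 1000;    % run 1000 MCMC iterations
    opt.missingRatio = 0.1;     % hold out 10% of entries for perplexity evaluation

    [RowTable, ColumnTable, TableCoordinates, nesw] = test_MCMC_PCRP(X, opt);

Omitting opt (the first call form above) presumably falls back to default settings.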

Reference

  1. M. Nakano, Y. Fujiwara, A. Kimura, T. Yamada, and N. Ueda, 'Permuton-induced Chinese Restaurant Process,' Advances in Neural Information Processing Systems 34 (NeurIPS 2021).

    @inproceedings{Nakano2021,
     author = {Nakano, Masahiro and Fujiwara, Yasuhiro and Kimura, Akisato and Yamada, Takeshi and Ueda, Naonori},
     booktitle = {Advances in Neural Information Processing Systems},
     pages = {},
     publisher = {Curran Associates, Inc.},
     title = {Permuton-induced Chinese Restaurant Process},
     url = {},
     volume = {34},
     year = {2021}
    }
    