TensorGraphicalModels

Julia package for multiway (inverse) covariance estimation.

Overview

TensorGraphicalModels.jl is a suite of Julia tools for estimating high-dimensional multiway (tensor-variate) covariance and inverse covariance matrices.

Installation

] add https://github.com/ywa136/TensorGraphicalModels.jl
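
The command above is entered at the package prompt of the Julia REPL (press ] to reach it). If you prefer the functional Pkg API, e.g. in a script, the following should be equivalent on recent Julia versions:

using Pkg
Pkg.add(url = "https://github.com/ywa136/TensorGraphicalModels.jl")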

Examples

Please check out the Julia Colab notebook created to illustrate some of the package's functionality. Here are some basic examples as well:

Example code for fitting a KP inverse covariance model:

using TensorGraphicalModels

model_type = "kp"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list) # multi-dimensional array (tensor) of size d_1 × … × d_K × N
Ψ_hat_list = kglasso(X)
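
kglasso returns the list of estimated per-mode factors. If the full precision matrix over all modes is needed, it can be assembled as the Kronecker product of these factors. A minimal sketch, assuming Ψ_hat_list holds the K factor matrices; the mapping from mode order to Kronecker order used here is an assumption and may need to be reversed:

using LinearAlgebra

# Kronecker product of the estimated factors (ordering convention assumed; check against the docs)
Ω_hat = reduce(kron, reverse(Ψ_hat_list))
@show size(Ω_hat) # expected: (prod(d_list), prod(d_list))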

Example code for fitting a KS inverse covariance model:

using TensorGraphicalModels
using LinearAlgebra # provides mul! used below

model_type = "ks"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list, tensorize_out = false) # matrix of size (d_1⋯d_K) × N

# compute the mode-k Gram matrices (the sufficient statistics for TeraLasso)
X_kGram = [zeros(d_list[k], d_list[k]) for k = 1:K]
Xk = [zeros(d_list[k], Int(prod(d_list) / d_list[k])) for k = 1:K]
for k = 1:K
    for i = 1:N
        copy!(Xk[k], tenmat(reshape(view(X, :, i), d_list...), k)) # mode-k matricization of sample i
        mul!(X_kGram[k], Xk[k], copy(transpose(Xk[k])), 1.0 / N, 1.0) # accumulate Xk * Xk' / N
    end
end

Ψ_hat_list, _ = teralasso(X_kGram)
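
In the Kronecker-sum (KS) model fitted by teralasso, the full precision matrix is the Kronecker sum of the per-mode factors, Ω = Ψ_1 ⊕ … ⊕ Ψ_K. The helper below, kronsum_full, is an illustrative sketch rather than a package function, and the mode-to-Kronecker ordering it uses is an assumption:

using LinearAlgebra

# Kronecker sum: Ω = Σ_k I ⊗ … ⊗ Ψ_k ⊗ … ⊗ I
function kronsum_full(Psi_list, d_list)
    Omega = zeros(prod(d_list), prod(d_list))
    for k in 1:length(Psi_list)
        dl = prod(d_list[1:k-1])   # combined size of the modes before mode k
        dr = prod(d_list[k+1:end]) # combined size of the modes after mode k
        Omega .+= kron(kron(Matrix(1.0I, dl, dl), Matrix(Psi_list[k])), Matrix(1.0I, dr, dr))
    end
    return Omega
end

Ω_hat = kronsum_full(Ψ_hat_list, d_list)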

Example code for fitting a Sylvester inverse covariance model:

using TensorGraphicalModels
using LinearAlgebra, SparseArrays # provide mul!, I, and sparse used below

model_type = "sylvester"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list, tensorize_out = false) # matrix of size (d_1⋯d_K) × N

# compute the mode-k Gram matrices (used, together with X, by SyGlasso)
X_kGram = [zeros(d_list[k], d_list[k]) for k = 1:K]
Xk = [zeros(d_list[k], Int(prod(d_list) / d_list[k])) for k = 1:K]
for k = 1:K
    for i = 1:N
        copy!(Xk[k], tenmat(reshape(view(X, :, i), d_list...), k)) # mode-k matricization of sample i
        mul!(X_kGram[k], Xk[k], copy(transpose(Xk[k])), 1.0 / N, 1.0) # accumulate Xk * Xk' / N
    end
end

Psi0 = [sparse(1.0I, d_list[k], d_list[k]) for k = 1:K] # identity initialization of the factors
fun = (iter, Psi) -> [1, time()] # diagnostic callback passed to the solver; this placeholder just returns a flag and a time stamp
lambda = [sqrt(d_list[k] * log(prod(d_list)) / N) for k = 1:K] # per-mode regularization parameters

Ψ_hat_list, _ = syglasso_palm(X, X_kGram, lambda, Psi0, fun = fun)
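
One quick sanity check on the SyGlasso output is the sparsity recovered in each estimated factor. The loop below is an illustrative sketch; it assumes each element of Ψ_hat_list is a (roughly) symmetric matrix, which is why the off-diagonal nonzero count is halved to count graph edges:

using LinearAlgebra

for (k, Ψ) in enumerate(Ψ_hat_list)
    n_offdiag = count(!iszero, Ψ) - count(!iszero, diag(Ψ)) # off-diagonal nonzeros
    println("mode-", k, " factor: ", size(Ψ, 1), " nodes, ", n_offdiag ÷ 2, " edges")
end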

Example code for fitting a KPCA covariance model:

using TensorGraphicalModels
using Distributions, PDMats, Statistics # provide MatrixNormal, ScalMat, and cov used below

px = py = 25 # the robust Kronecker PCA routine works for K = 2 modes only
N = 100
X = zeros((px * py, N))

for i=1:N
    X[:, i] .= vec(rand(MatrixNormal(zeros((px, py)), ScalMat(px, 2.0), ScalMat(py, 4.0))))
end

S = cov(copy(X')) # sample covariance matrix
lambdaL = 20 * (px^2 + py^2 + log(max(px, py, N))) / N
lambdaS = 20 * sqrt(log(px * py)/N)

# robust Kronecker PCA methods using singular value thresholding
Sigma_hat = robust_kron_pca(S, px, py, lambdaL, lambdaS, "SVT"; tau = 0.5, r = 5)
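
Because the samples in this example come from a matrix-normal with known row and column covariances (2·I and 4·I), the true covariance of the vectorized data is known and the estimate can be checked against it. A minimal sketch, assuming Sigma_hat is the (px·py) × (px·py) covariance estimate returned above and using the standard convention cov(vec(X)) = V ⊗ U for X ~ MatrixNormal(0, U, V):

using LinearAlgebra

Sigma_true = kron(Matrix(4.0I, py, py), Matrix(2.0I, px, px)) # V ⊗ U (here simply 8·I)
rel_err = norm(Sigma_hat - Sigma_true) / norm(Sigma_true)     # relative Frobenius error
println("relative Frobenius error: ", rel_err)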

Owner

Wayne Wang, Ph.D. candidate in statistics