ZSL-KG

Overview

ZSL-KG is a general-purpose zero-shot learning framework with a novel transformer graph convolutional network (TrGCN) that learns class representations from common sense knowledge graphs.

Reference paper: Zero-shot Learning with Common Sense Knowledge Graphs.

Performance

Performance of ZSL-KG compared with existing graph-based zero-shot learning frameworks.

| Method  | OntoNotes (Strict) | BBN (Strict) | SNIPS-NLU (Acc.) | AWA2 (H) | aPY (H) | ImageNet (All T-1) | Avg. |
|---------|--------------------|--------------|------------------|----------|---------|--------------------|------|
| GCNZ    | 41.5               | 21.5         | 82.5             | 73.3     | 58.1    | 1.0                | 46.3 |
| SGCN    | 42.6               | 24.9         | 50.3             | 73.7     | 56.8    | 1.5                | 41.6 |
| DGP     | 41.1               | 24.0         | 64.4             | 75.1     | 55.7    | 1.4                | 43.6 |
| ZSL-KG  | 45.2               | 26.7         | 89.0             | 74.6     | 61.6    | 1.7                | 49.8 |

ZSL-KG outperforms existing graph-based frameworks on five out of six benchmark datasets.

For more details on the experiments, refer to nayak-arxiv20-code.

Installation

The package requires Python >= 3.7. To install the package, run the following command:

pip install .

Example Usage

In our framework, we use AutoGNN to easily create graph neural networks for zero-shot learning.

from torch import nn  # for the activation functions used below
from zsl_kg.class_encoder.auto_gnn import AutoGNN
from zsl_kg.common.graph import NeighSampler

# Two-layer TrGCN class encoder: 300-dim node features -> 2048 -> 2049.
trgcn = {
    "input_dim": 300,
    "output_dim": 2049,
    "type": "trgcn",
    "gnn": [
        {
            "input_dim": 300,
            "output_dim": 2048,
            "activation": nn.LeakyReLU(0.2),
            "normalize": True,
            "sampler": NeighSampler(100, mode="topk"),
            "fh": 100,
        },
        {
            "input_dim": 2048,
            "output_dim": 2049,
            "activation": None,
            "normalize": True,
            "sampler": NeighSampler(50, mode="topk"),
        },
    ],
}

class_encoder = AutoGNN(trgcn)
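
The class encoder maps knowledge-graph nodes for the class labels to dense representations (2,049-dimensional with the config above). The following is a minimal, hypothetical sketch of how such class representations are typically used for zero-shot prediction; the tensors are random placeholders, and producing real class representations and example features requires the knowledge graph and a feature extractor, both omitted here.

import torch

# Placeholders standing in for real outputs:
# class_reps    - one representation per candidate class (e.g., from class_encoder)
# example_feats - features for a batch of examples (e.g., from a vision backbone)
# 2049 matches the encoder output_dim in the config above.
class_reps = torch.randn(10, 2049)
example_feats = torch.randn(32, 2049)

# Zero-shot prediction: score every example against every class representation
# with a dot product and pick the highest-scoring class.
scores = example_feats @ class_reps.t()   # shape: (32, 10)
predictions = scores.argmax(dim=-1)       # predicted class index per example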

Our framework supports the following graph neural networks: gcn, gat, rgcn, lstm, and trgcn. You can change the type to any of the available options to instantly create a new graph neural network.
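
For example, the configuration above could be reused with a different type to build a GCN-based class encoder instead of TrGCN. This is a sketch under the assumption that the per-layer options shown above carry over between types; some types (e.g., rgcn) may require additional relation-specific settings, and keys such as "fh" may apply only to trgcn layers.

# Hypothetical variation: same two-layer configuration, different aggregator type.
gcn_config = dict(trgcn, type="gcn")
gcn_encoder = AutoGNN(gcn_config)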

For more examples, refer to nayak-arxiv20-code.

Run Tests

To run the tests, please type the following command:

pytest

Citation

Please cite the following paper if you use our framework.

@article{nayak:arxiv20,
  Author = {Nayak, Nihal V. and Bach, Stephen H.},
  Title = {Zero-Shot Learning with Common Sense Knowledge Graphs},
  Volume = {arXiv:2006.10713 [cs.LG]},
  Year = {2020}}