HGNet
Overview

Data and code for the paper "Outlining and Filling: Hierarchical Query Graph Generation for Answering Complex Questions over Knowledge Graph" are available for research purposes.

Results

We evaluate our approach on three KGQA benchmarks: ComplexWebQuestions (Talmor and Berant, 2018), LC-QuAD (Trivedi et al., 2017), and WebQSP (Yih et al., 2016).

Dataset               Structure Acc.   Query Graph Acc.   Precision   Recall   F1-score   Hit@1
ComplexWebQuestions   66.96            51.68              65.27       68.44    64.95      65.25
LC-QuAD               78.00            60.90              75.82       75.22    75.10      76.00
WebQSP                79.91            62.63              70.22       74.38    70.61      70.37

Requirements

  • Python == 3.7.0
  • cudatoolkit == 10.1.243
  • cudnn == 7.6.5
  • six == 1.15.0
  • torch == 1.4.0
  • transformers == 4.9.2
  • numpy == 1.19.2
  • SPARQLWrapper == 1.8.5
  • rouge_score == 0.0.4
  • filelock == 3.0.12
  • nltk == 3.6.2
  • absl == 0.0
  • dataclasses == 0.6
  • datasets == 1.9.0
  • jsonlines == 2.0.0
  • python_Levenshtein == 0.12.2
  • Virtuoso SPARQL query service
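
One possible way to set up this environment is sketched below. It is only a sketch, assuming conda and pip are available, and simply mirrors the pinned versions listed above rather than the authors' exact setup; absl (from the list above) and the Virtuoso SPARQL service are set up separately.

# create and activate a Python 3.7 environment with the listed CUDA libraries
conda create -n hgnet python=3.7.0
conda activate hgnet
conda install cudatoolkit=10.1.243 cudnn=7.6.5
# install the pinned Python dependencies from the list above
pip install six==1.15.0 torch==1.4.0 transformers==4.9.2 numpy==1.19.2 \
    SPARQLWrapper==1.8.5 rouge_score==0.0.4 filelock==3.0.12 nltk==3.6.2 \
    dataclasses==0.6 datasets==1.9.0 jsonlines==2.0.0 python_Levenshtein==0.12.2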

Data

  • Download and unzip our preprocessed data to ./. You can also regenerate it by running our scripts under ./preprocess.

  • Download the Freebase and DBpedia dumps we use. Both contain only English triples; triples in other languages have been removed. Download and install Virtuoso to host the SPARQL query service for the downloaded Freebase and DBpedia. Here is a tutorial on how to install Virtuoso and import a knowledge graph into it. A quick check of the endpoint is sketched after this list.

  • Download the GloVe embeddings glove.42B.300d.txt and put the file at your_glove_path.

  • Download our vocabulary from here, unzip it, and put it under ./. It contains the SPARQL cache used by the Execution-Guided strategy.
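
After importing Freebase/DBpedia, you can confirm that the Virtuoso SPARQL endpoint is reachable before training or testing. The following is only a sketch, assuming Virtuoso listens on its default port 8890 on localhost; adjust the host and port to your installation.

# ask the endpoint for the number of loaded triples; a JSON result means the service is up
curl -G http://localhost:8890/sparql \
     -H "Accept: application/sparql-results+json" \
     --data-urlencode "query=SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }"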

Running Code

1. Training for HGNet

Before training, set the following hyperparameter in train_cwq.sh, train_lcq.sh, and train_wsp.sh.

--glove_path your_glove_path

Execute the following command to train the model on ComplexWebQuestions.

sh train_cwq.sh

Execute the following command to train the model on LC-QuAD.

sh train_lcq.sh

Execute the following command to train the model on WebQSP.

sh train_wsp.sh

The trained model file is saved under the ./runs directory.
The path format of the trained model is ./runs/RUN_ID/checkpoints/best_snapshot_epoch_xx_best_val_acc_xx_model.pt.

2. Testing for HGNet

Before testing, you need to train a model first and set the following hyperparameters in eval_cwq.sh, eval_lcq.sh, and eval_wsp.sh.

--cpt your_trained_model_path
--kb_endpoint your_sparql_service_ip
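
For example, using the checkpoint path format from the training section and a local Virtuoso instance on its default port (both values are illustrative only; substitute your own checkpoint path and endpoint):

--cpt ./runs/RUN_ID/checkpoints/best_snapshot_epoch_xx_best_val_acc_xx_model.pt
--kb_endpoint http://localhost:8890/sparql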

You can also directly download our trained models from here. Unzip and put them under ./.

Execute the following command to test the model on ComplexWebQuestions.

sh eval_cwq.sh

Execute the following command to test the model on LC-QuAD.

sh eval_lcq.sh

Execute the following command to test the model on WebQSP.

sh eval_wsp.sh