Sequence-to-Sequence Learning with Latent Neural Grammars

Code for the paper:
Sequence-to-Sequence Learning with Latent Neural Grammars
Yoon Kim
arXiv Preprint

Dependencies

The code was tested with Python 3.7 and PyTorch 1.5. We also use a slightly modified version of the Torch-Struct library, which is included in this repo and can be installed via:

cd pytorch-struct
python setup.py install
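
For reference, a minimal environment setup might look like the following sketch, run before the Torch-Struct install above (the environment name is hypothetical; the version pins follow the compatibility note):

conda create -n neural-grammars python=3.7   # environment name is an assumption
conda activate neural-grammars
pip install torch==1.5.0                     # matches the tested PyTorch version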

Data

For convenience, we include the datasets used in the paper in the data/ folder. Please cite the original papers when using the data (i.e., Lake and Baroni 2018 for SCAN/MT, and Lyu et al. 2021 for StylePTB).
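
As an illustration, each line of the SCAN files pairs an input command with its output action sequence in the standard SCAN format:

IN: jump twice OUT: I_JUMP I_JUMP

The StylePTB .tsv files are assumed to hold one tab-separated source/target sentence pair per line.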

Training

SCAN

To train the model on (for example) the length split:

python train_scan.py --train_file data/SCAN/tasks_train_length.txt --save_path scan-length.pt

For prediction and evaluation:

python predict_scan.py --data_file data/SCAN/tasks_test_length.txt --model_path scan-length.pt
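
The other splits follow the same pattern. As a sketch, the following loop trains and evaluates one model per split (the split names assume SCAN's standard file naming; check data/SCAN/ for the files actually included):

for split in simple length addprim_jump; do
  python train_scan.py --train_file data/SCAN/tasks_train_${split}.txt --save_path scan-${split}.pt
  python predict_scan.py --data_file data/SCAN/tasks_test_${split}.txt --model_path scan-${split}.pt
done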

Style Transfer

To train on (for example) the active-to-passive task:

python train_styleptb.py --train_file data/StylePTB/ATP/train.tsv --dev_file data/StylePTB/ATP/valid.tsv --save_path styleptb-atp.pt

To predict:

python predict_styleptb.py --data_file data/StylePTB/ATP/test.tsv --model_path styleptb-atp.pt \
  --out_file styleptb-atp-pred.txt

We use the nlg-eval package to calculate the various metrics.
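
For example, with the nlg-eval command-line tool (this assumes the target sentences sit in the second tab-separated column of the test file; adjust the cut field if the layout differs):

cut -f2 data/StylePTB/ATP/test.tsv > styleptb-atp-ref.txt   # extract reference targets (column layout assumed)
nlg-eval --hypothesis=styleptb-atp-pred.txt --references=styleptb-atp-ref.txt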

Machine Translation

To train on MT:

python train_mt.py --train_file_src data/MT/train.en --train_file_tgt data/MT/train.fr \
  --dev_file_src data/MT/dev.en --dev_file_tgt data/MT/dev.fr --save_path mt.pt

To predict on the daxy test set:

python predict_mt.py --data_file data/MT/test-daxy.en --model_path mt.pt --out_file mt-pred-daxy.txt

For the regular test set:

python predict_mt.py --data_file data/MT/test.en --model_path mt.pt --out_file mt-pred.txt

We use the multi-bleu.perl script from Moses to calculate BLEU.
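
For example, assuming multi-bleu.perl is on the path and the French reference files follow the same naming as the English source files (an assumption):

perl multi-bleu.perl data/MT/test.fr < mt-pred.txt
perl multi-bleu.perl data/MT/test-daxy.fr < mt-pred-daxy.txt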

Training Stability

We observed training to be unstable, and the approach required several runs with different random seeds to perform well. For reference, we have posted logs from some example runs in the logs/ folder.

License

MIT
