RefineGNN: Iterative refinement graph neural network for antibody sequence-structure co-design

Overview

RefineGNN is an iterative refinement graph neural network for antibody sequence-structure co-design.

This is the implementation of our ICLR 2022 paper: https://arxiv.org/pdf/2110.04624.pdf

Warning: this repo is still under construction...

Language model and CDR structure prediction (Section 4.1)

Antibody structure data is retrieved from the Structural Antibody Database (SAbDab). The training, validation, and test sets are provided in data/sabdab; please decompress the files in that folder. To train a generative model for CDR-H3, please run

python ab_train.py --cdr_type 3 
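The splits in data/sabdab are JSON-lines files, so each line is one self-contained JSON object. A minimal sketch of reading one record follows; the field names below are hypothetical stand-ins, not the repo's actual schema, so inspect one line of the decompressed files to confirm the real keys. To keep the sketch self-contained, it writes a tiny file of its own instead of touching data/sabdab:

```python
import json
import os
import tempfile

# A hypothetical antibody record; the real field names in data/sabdab/*.jsonl
# may differ -- this only illustrates the JSON-lines layout.
record = {"pdb": "1a2b", "cdrh3_seq": "ARDYYGSSYFDY"}

# JSON lines: one JSON object per line, so files can be streamed line by line.
path = os.path.join(tempfile.mkdtemp(), "train.jsonl")
with open(path, "w") as f:
    f.write(json.dumps(record) + "\n")

with open(path) as f:
    loaded = [json.loads(line) for line in f]

print(loaded[0]["cdrh3_seq"])  # -> ARDYYGSSYFDY
```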

Antigen-binding antibody design (Section 4.2)

Antibody-antigen binding data is provided in data/rabd. To train a generative model, please run

python ab_train.py --train_path data/rabd/train.jsonl --val_path data/rabd/val.jsonl --test_path data/rabd/test.jsonl
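To run the same command on your own data, you need train/val/test splits in the same JSON-lines layout. A minimal splitting sketch, assuming one JSON object per entry (the "pdb" field and the 80/10/10 ratio are illustrative choices, not the repo's):

```python
import json
import os
import random
import tempfile

# Hypothetical records standing in for a custom antibody-antigen set.
records = [{"pdb": f"pdb{i}"} for i in range(10)]

random.seed(0)
random.shuffle(records)

# 80/10/10 split, written out as train/val/test .jsonl files.
outdir = tempfile.mkdtemp()
splits = {"train": records[:8], "val": records[8:9], "test": records[9:]}
for name, rows in splits.items():
    with open(os.path.join(outdir, f"{name}.jsonl"), "w") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")

print({k: len(v) for k, v in splits.items()})  # -> {'train': 8, 'val': 1, 'test': 1}
```

The resulting paths can then be passed to ab_train.py via --train_path, --val_path, and --test_path as in the command above.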

Comments
  • the CDR length n


    Thank you very much for such inspiring work. I have a small question about the CDR length n. The pseudocode on page 6 of the paper says that the length n of the CDR region is predicted first. However, in the code for Section 4.2, RefineGNN uses the true CDR length to generate the CDR-H3 sequence and structure for RAbD. Is there something I missed, or is this intentional?

    opened by 897741007 0
  • do we need to train a mAb/antigen pair to redesign the CDRs?


    We have a few cases of mAb/antigen pairs where we know there is or isn't binding, but we don't know the exact binding mechanism (epitope/paratope).

    If we want to increase the binding or stability of a mAb/antigen pair with RefineGNN, starting from an AlphaFold-Multimer PDB of the mAb Fv and the antigen (which may be wrong), how do we run RefineGNN to redesign the CDRs to increase affinity/binding? Do we need to train on our own mAb/antigen pairs to redesign the CDRs?

    opened by avilella 0
  • Toward the biological meaning of generating CDR region with remaining sequence?


    Hi, thanks for the great work! I have a question about the biological meaning of the task. Since the CDR region is antigen-specific, how can we generate the CDR sequence and structure conditioned only on the remaining sequence, without conditioning on the antigen? I hope you can help clear up my confusion.

    opened by YifanDengWHU 0
  • create a docker container for reproducible model inference


    Hi Wengong,

    I read the RefineGNN paper today and would love to run it within a Docker container. I created a Dockerfile that I hope fulfills the installation requirements. One thing I noticed was difficulty installing torch==1.8.2: the long-term-support version seems to be unavailable via pip install. Have you seen this problem too? I simply used torch==1.9.0 in this build; it seems one might have more luck using a conda build of pytorch-lts if 1.8.2 is a strict requirement.

    How did you install torch==1.8.2?

    opened by NiklasTR 0
Owner

Wengong Jin