Protein GLM (wip)
Implementation of a protein autoregressive language model trained with an autoregressive blank-infilling objective, which grants the ability to edit subsequences. It will also make use of a super-conditioning technique as outlined here.
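To make the training objective concrete, below is a minimal sketch of GLM-style blank-infilling data preparation. The function name, special token ids, and span-sampling details are all assumptions for illustration, not this repository's actual API.

```python
# A minimal sketch of GLM-style autoregressive blank infilling, assuming
# integer-encoded amino acid tokens. All special token ids are hypothetical.

import random

MASK_ID = 1   # hypothetical [MASK] token id, marks a blanked span in Part A
SOS_ID  = 2   # hypothetical start-of-span token id, prefixes each span in Part B
EOS_ID  = 3   # hypothetical end-of-span token id, terminates each span target

def blank_infilling_example(tokens, num_spans=2, max_span_len=5):
    """Blank out random spans (Part A), then append the shuffled spans
    (Part B) so the model learns to fill in blanks autoregressively."""
    tokens = list(tokens)
    n = len(tokens)

    # sample non-overlapping spans to blank out
    spans, taken = [], set()
    for _ in range(num_spans):
        length = random.randint(1, max_span_len)
        start = random.randint(0, max(0, n - length))
        if any(i in taken for i in range(start, start + length)):
            continue
        taken.update(range(start, start + length))
        spans.append((start, start + length))
    spans.sort()

    # Part A: the original sequence with each span replaced by a single [MASK]
    part_a, cursor = [], 0
    for start, end in spans:
        part_a += tokens[cursor:start] + [MASK_ID]
        cursor = end
    part_a += tokens[cursor:]

    # Part B: the spans in shuffled order, each prefixed with a start-of-span
    # token; the target for each span is its tokens followed by end-of-span
    order = list(range(len(spans)))
    random.shuffle(order)
    part_b, targets = [], []
    for i in order:
        start, end = spans[i]
        part_b += [SOS_ID] + tokens[start:end]
        targets += tokens[start:end] + [EOS_ID]

    # the model attends bidirectionally over Part A
    # and autoregressively (causally) over Part B
    return part_a, part_b, targets

input_a, input_b, targets = blank_infilling_example(list(range(10, 30)))
```

Super-conditioning can likewise be sketched as a scaled mix of conditional and unconditional logits at sampling time, in the spirit of classifier-free guidance; the naming and default scale below are assumptions, since the referenced write-up is not reproduced here.

```python
# A minimal sketch of super-conditioning at sampling time, assuming the model
# can produce logits both with and without the conditioning signal.

def super_condition(cond_logits, uncond_logits, scale = 3.):
    # scale > 1 pushes the distribution further toward the conditional
    # signal than ordinary conditioning would
    return uncond_logits + scale * (cond_logits - uncond_logits)
```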
The Transformer model will incorporate all state-of-the-art improvements currently known.
Citations
@inproceedings{Du2021GLMGL,
    title  = {GLM: General Language Model Pretraining with Autoregressive Blank Infilling},
    author = {Zhengxiao Du and Yujie Qian and Xiao Liu and Ming Ding and Jiezhong Qiu and Zhilin Yang and Jie Tang},
    year   = {2021}
}