# Transformer in Triton (wip)
Implementation of a Transformer, written entirely in Triton. I'm new to lower-level neural network code, so this repository will mostly be a learning experience, with the end goal being a vanilla transformer that is faster and more efficient to train.
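For background, Triton lets you write GPU kernels directly in Python, which are JIT-compiled to run on the GPU. The sketch below is a minimal elementwise-add kernel in the style of the official Triton tutorial; it is illustrative only and not taken from this repository.

```python
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # each program instance handles one BLOCK_SIZE-wide chunk of the tensors
    pid = tl.program_id(axis = 0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard against out-of-bounds accesses
    x = tl.load(x_ptr + offsets, mask = mask)
    y = tl.load(y_ptr + offsets, mask = mask)
    tl.store(out_ptr + offsets, x + y, mask = mask)

def add(x, y):
    out = torch.empty_like(x)
    n_elements = out.numel()
    grid = lambda meta: (triton.cdiv(n_elements, meta['BLOCK_SIZE']),)
    add_kernel[grid](x, y, out, n_elements, BLOCK_SIZE = 1024)
    return out
```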
## Install

```bash
$ pip install triton-transformer
```
## Usage
```python
import torch
from triton_transformer import Transformer

model = Transformer(
    num_tokens = 256,     # vocabulary size
    max_seq_len = 1024,   # maximum sequence length
    dim = 512,            # model dimension
    depth = 6,            # number of layers
    heads = 8,            # attention heads
    dim_head = 64         # dimension per head
)

x = torch.randint(0, 256, (1, 1024))   # token ids
mask = torch.ones(1, 1024).bool()      # boolean attention mask

logits = model(x, mask = mask) # (1, 1024, 256)
```
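Since this is a work in progress, the snippet below is only a sketch of how the returned logits could feed a training step; it assumes a plain cross-entropy objective, and the `labels` tensor is a hypothetical stand-in for real targets.

```python
import torch.nn.functional as F

labels = torch.randint(0, 256, (1, 1024))    # hypothetical target token ids
loss = F.cross_entropy(
    model(x, mask = mask).transpose(1, 2),   # cross entropy expects (batch, classes, seq)
    labels
)
loss.backward()
```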
## Citations
```bibtex
@article{Tillet2019TritonAI,
    title   = {Triton: an intermediate language and compiler for tiled neural network computations},
    author  = {Philippe Tillet and Hsiang-Tsung Kung and David Cox},
    journal = {Proceedings of the 3rd ACM SIGPLAN International Workshop on Machine Learning and Programming Languages},
    year    = {2019}
}
```

```bibtex
@misc{vaswani2017attention,
    title         = {Attention Is All You Need},
    author        = {Ashish Vaswani and Noam Shazeer and Niki Parmar and Jakob Uszkoreit and Llion Jones and Aidan N. Gomez and Lukasz Kaiser and Illia Polosukhin},
    year          = {2017},
    eprint        = {1706.03762},
    archivePrefix = {arXiv},
    primaryClass  = {cs.CL}
}
```