# cosFormer
Official implementation of the cosFormer attention mechanism from the ICLR 2022 paper *cosFormer: Rethinking Softmax in Attention*.
## Update log
- 2022/2/28
  - Added the core cosFormer attention code (a minimal sketch of the underlying idea follows below).
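
For quick orientation, here is a minimal sketch of the idea behind cosFormer attention as described in the paper: queries and keys pass through a ReLU feature map and are re-weighted by position-dependent cos/sin factors, using the identity cos(a - b) = cos(a)cos(b) + sin(a)sin(b) so the whole computation stays linear in sequence length. This is a non-causal, single-head illustration under my own naming; it is not the repository's API, whose core code also covers the causal and multi-head cases.

```python
import math

import torch
import torch.nn.functional as F


def cosformer_attention(q, k, v, eps=1e-6):
    """Minimal non-causal cosFormer attention sketch (illustrative only).

    q, k, v: tensors of shape (batch, seq_len, dim).
    Similarity: ReLU(q_i) ReLU(k_j)^T * cos(pi/2 * (i - j) / M),
    decomposed via cos(a - b) = cos(a)cos(b) + sin(a)sin(b).
    """
    b, m, d = q.shape
    q, k = F.relu(q), F.relu(k)

    # Position-dependent re-weighting factors cos(pi*i / 2M), sin(pi*i / 2M).
    idx = torch.arange(1, m + 1, device=q.device, dtype=q.dtype)
    theta = (math.pi / 2) * idx / m                  # shape (m,)
    cos_w = torch.cos(theta)[None, :, None]          # shape (1, m, 1)
    sin_w = torch.sin(theta)[None, :, None]

    q_cos, q_sin = q * cos_w, q * sin_w
    k_cos, k_sin = k * cos_w, k * sin_w

    # Linear attention: contract keys with values first, O(m * d^2)
    # instead of the O(m^2 * d) of softmax attention.
    kv_cos = torch.einsum('bmd,bme->bde', k_cos, v)
    kv_sin = torch.einsum('bmd,bme->bde', k_sin, v)
    num = (torch.einsum('bmd,bde->bme', q_cos, kv_cos)
           + torch.einsum('bmd,bde->bme', q_sin, kv_sin))

    # Row-wise normalizer: sum of re-weighted similarities over all keys.
    z = (torch.einsum('bmd,bd->bm', q_cos, k_cos.sum(dim=1))
         + torch.einsum('bmd,bd->bm', q_sin, k_sin.sum(dim=1)))
    return num / (z[..., None] + eps)
```
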
## License
This repository is released under the Apache 2.0 license as found in the LICENSE file.
## Citation
If you use this code in your work, please cite:
```
@inproceedings{zhen2022cosformer,
  title={cosFormer: Rethinking Softmax In Attention},
  author={Zhen Qin and Weixuan Sun and Hui Deng and Dongxu Li and Yunshen Wei and Baohong Lv and Junjie Yan and Lingpeng Kong and Yiran Zhong},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=Bl8CQrx2Up4}
}
```