
# Linformer: Self-Attention with Linear Complexity (Wang et al., 2020)

This example contains code to train Linformer models as described in our paper Linformer: Self-Attention with Linear Complexity.

## Training a new Linformer RoBERTa model

You can mostly follow the RoBERTa pretraining README, updating your training command with `--user-dir examples/linformer/linformer_src --arch linformer_roberta_base`.
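As a rough illustration, a pretraining invocation adapted from the RoBERTa pretraining README might look like the sketch below. The data path and all hyperparameter values are placeholders, not values from this example; only the `--user-dir` and `--arch` flags are the Linformer-specific part.

```shell
# Sketch only: assumes fairseq is installed and $DATA_DIR points to a
# binarized masked-LM dataset (hypothetical placeholder path).
DATA_DIR=data-bin/my_corpus

fairseq-train "$DATA_DIR" \
    --user-dir examples/linformer/linformer_src \
    --arch linformer_roberta_base \
    --task masked_lm --criterion masked_lm \
    --tokens-per-sample 512 --sample-break-mode complete \
    --optimizer adam --adam-betas '(0.9,0.98)' --adam-eps 1e-6 \
    --lr 0.0005 --lr-scheduler polynomial_decay \
    --warmup-updates 10000 --total-num-update 125000 \
    --batch-size 16 --update-freq 16 \
    --max-update 125000
```

Aside from the two Linformer flags, everything here should be tuned as you would for a standard RoBERTa pretraining run.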

## Citation

If you use our work, please cite:

@article{wang2020linformer,
  title={Linformer: Self-Attention with Linear Complexity},
  author={Wang, Sinong and Li, Belinda and Khabsa, Madian and Fang, Han and Ma, Hao},
  journal={arXiv preprint arXiv:2006.04768},
  year={2020}
}