Subhabrata Mukherjee committed on
Commit 8468742
1 Parent(s): fbedb81

Update README.md

Files changed (1): README.md (+7 -22)
README.md CHANGED
@@ -32,28 +32,13 @@ Tested with `tensorflow 2.3.1, transformers 4.1.1, torch 1.6.0`
 If you use this checkpoint in your work, please cite:
 
 ``` latex
-@inproceedings{mukherjee-hassan-awadallah-2020-xtremedistil,
-    title = "{X}treme{D}istil: Multi-stage Distillation for Massive Multilingual Models",
-    author = "Mukherjee, Subhabrata and
-      Hassan Awadallah, Ahmed",
-    booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
-    month = jul,
-    year = "2020",
-    address = "Online",
-    publisher = "Association for Computational Linguistics",
-    url = "https://www.aclweb.org/anthology/2020.acl-main.202",
-    doi = "10.18653/v1/2020.acl-main.202",
-    pages = "2221--2234",
+@misc{mukherjee2021xtremedistiltransformers,
+    title={XtremeDistilTransformers: Task Transfer for Task-agnostic Distillation},
+    author={Subhabrata Mukherjee and Ahmed Hassan Awadallah and Jianfeng Gao},
+    year={2021},
+    eprint={2106.04563},
+    archivePrefix={arXiv},
+    primaryClass={cs.CL}
 }
 ```
-
-``` latex
-@misc{wang2020minilm,
-    title={MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers},
-    author={Wenhui Wang and Furu Wei and Li Dong and Hangbo Bao and Nan Yang and Ming Zhou},
-    year={2020},
-    eprint={2002.10957},
-    archivePrefix={arXiv},
-    primaryClass={cs.CL}
-}
-```