uer committed
Commit
eb2fa81
1 Parent(s): afd26bf

Update README.md

Files changed (1):
  1. README.md +9 -1
README.md CHANGED
@@ -19,7 +19,7 @@ widget:
 
 ## Model description
 
-This is the sentence embedding model pre-trained by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658).
+This is the sentence embedding model pre-trained by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658). In addition, the model can also be pre-trained by [TencentPretrain](https://github.com/Tencent/TencentPretrain), introduced in [this paper](https://arxiv.org/abs/2212.06385), which inherits UER-py to support models with more than one billion parameters and extends it to a multimodal pre-training framework.
 
 ## How to use
 
@@ -68,6 +68,7 @@ python3 scripts/convert_sbert_from_uer_to_huggingface.py --input_model_path mode
 journal={arXiv preprint arXiv:1908.10084},
 year={2019}
 }
+
 @article{zhao2019uer,
 title={UER: An Open-Source Toolkit for Pre-training Models},
 author={Zhao, Zhe and Chen, Hui and Zhang, Jinbin and Zhao, Xin and Liu, Tao and Lu, Wei and Chen, Xi and Deng, Haotang and Ju, Qi and Du, Xiaoyong},
@@ -75,4 +76,11 @@ python3 scripts/convert_sbert_from_uer_to_huggingface.py --input_model_path mode
 pages={241},
 year={2019}
 }
+
+@article{zhao2023tencentpretrain,
+title={TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities},
+author={Zhao, Zhe and Li, Yudong and Hou, Cheng and Zhao, Jing and others},
+journal={ACL 2023},
+pages={217},
+year={2023}
 ```
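
The README describes an SBERT-style sentence embedding model. As a minimal, hypothetical sketch (plain NumPy with toy vectors — not the model's actual code), mean pooling over token embeddings under an attention mask is the operation such models typically use to turn per-token outputs into one vector per sentence:

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over the sequence axis, ignoring padding (mask == 0)."""
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)                   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # avoid divide-by-zero
    return summed / counts

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two sentence vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy token embeddings for two "sentences" of length 3 (dim 4); the last token
# of the second sentence is padding, so it must not affect that embedding.
tokens = np.array([
    [[1., 0., 0., 0.], [0., 1., 0., 0.], [0., 0., 1., 0.]],
    [[1., 0., 0., 0.], [0., 1., 0., 0.], [9., 9., 9., 9.]],
])
mask = np.array([[1, 1, 1],
                 [1, 1, 0]])

sentences = mean_pool(tokens, mask)
print(cosine_sim(sentences[0], sentences[1]))
```

Real usage would feed the encoder's last hidden states and tokenizer attention mask into the same pooling step; the sketch only shows why the mask matters (padding vectors are excluded from the average).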