Joint Laboratory of HIT and iFLYTEK Research (HFL) committed
Commit: 7046f8c
Parent: 3e2c367

Create README.md

Files changed (1): README.md (+54, -0)

README.md (added):
---
language:
- zh
tags:
- bert
license: "apache-2.0"
---
## Chinese BERT with Whole Word Masking
To further accelerate Chinese natural language processing, we provide a **Chinese pre-trained BERT with Whole Word Masking**.
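
For a quick start, the checkpoint can be loaded with the Hugging Face Transformers library. A minimal usage sketch; the model identifier `hfl/chinese-bert-wwm` is an assumption about where this card is hosted on the Hub, so adjust the path if it differs:

```python
import torch
from transformers import BertTokenizer, BertModel

# NOTE: "hfl/chinese-bert-wwm" is an assumed Hub path for this checkpoint.
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm")
model = BertModel.from_pretrained("hfl/chinese-bert-wwm")

# Encode a Chinese sentence and extract contextual embeddings.
inputs = tokenizer("使用语言模型来预测下一个词的概率。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Shape: (batch_size, sequence_length, hidden_size); hidden_size is 768 for BERT-base.
print(outputs.last_hidden_state.shape)
```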

**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**
Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu

This repository is developed based on https://github.com/google-research/bert
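
Whole Word Masking (WWM) changes only the masking step of pre-training: when a segmented Chinese word is selected for masking, all of its characters are masked together rather than independently. A toy sketch of the idea; the sentence is assumed to be pre-segmented (the paper uses LTP for word segmentation), and `whole_word_mask` is an illustrative helper, not an API of this repository:

```python
import random

def whole_word_mask(words, mask_token="[MASK]", mask_prob=0.15):
    """Toy whole word masking: when a word is selected, mask ALL of its
    characters at once instead of masking characters independently."""
    masked = []
    for word in words:
        if random.random() < mask_prob:
            # Chinese BERT tokenizes into characters, so an n-character
            # word spans n tokens; replace every one of them with [MASK].
            masked.extend([mask_token] * len(word))
        else:
            masked.extend(list(word))
    return masked

# Pre-segmented sentence: 使用 / 语言 / 模型 / 来 / 预测 / 下 / 一个 / 词
print(whole_word_mask(["使用", "语言", "模型", "来", "预测", "下", "一个", "词"]))
```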

You may also be interested in:
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## Citation
If you find the technical reports or resources useful, please cite the following technical reports in your paper.
- Primary: https://arxiv.org/abs/2004.13922
```
@inproceedings{cui-etal-2020-revisiting,
    title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
    author = "Cui, Yiming and
      Che, Wanxiang and
      Liu, Ting and
      Qin, Bing and
      Wang, Shijin and
      Hu, Guoping",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
    pages = "657--668",
}
```
- Secondary: https://arxiv.org/abs/1906.08101
```
@article{chinese-bert-wwm,
    title={Pre-Training with Whole Word Masking for Chinese BERT},
    author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
    journal={arXiv preprint arXiv:1906.08101},
    year={2019}
}
```