ymcui committed • Commit 703ed80 • Parent(s): 982a35a
update info

README.md CHANGED
@@ -5,6 +5,8 @@ tags:
 - bert
 license: "apache-2.0"
 ---
+# This is a re-trained 6-layer RoBERTa-wwm-ext model.
+
 ## Chinese BERT with Whole Word Masking
 For further accelerating Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.

@@ -51,4 +53,4 @@ If you find the technical report or resource is useful, please cite the following
   journal={arXiv preprint arXiv:1906.08101},
   year={2019}
 }
-```
+```
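The model card's headline technique, Whole Word Masking, masks every piece of a word together rather than masking individual WordPiece tokens independently (for Chinese, the word groups come from a word segmenter such as LTP rather than from `##` markers). A minimal sketch of the idea, assuming WordPiece-style `##` continuation markers; the function name and parameters are illustrative, not taken from the actual pretraining code:

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, seed=0):
    """Mask whole words: every WordPiece of a chosen word is replaced
    by [MASK] together. This is a sketch of the WWM idea, not the
    original pretraining implementation."""
    rng = random.Random(seed)
    # Group token indices into words: a token starting with "##"
    # continues the previous word.
    groups = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and groups:
            groups[-1].append(i)
        else:
            groups.append([i])
    masked = list(tokens)
    for group in groups:
        if rng.random() < mask_prob:
            for i in group:
                masked[i] = "[MASK]"  # mask all pieces of the word at once
    return masked

# With plain token-level masking, "play" and "##ing" could be masked
# independently; with WWM they are always masked (or kept) together.
print(whole_word_mask(["play", "##ing", "ball"], mask_prob=1.0))
```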