---
language:
- zh
license: apache-2.0
widget:
- text: "北京是[MASK]国的首都。"
---
# Mengzi-BERT base model (Chinese)
Pretrained model on a 300G Chinese corpus. Masked language modeling (MLM), part-of-speech (POS) tagging, and sentence order prediction (SOP) were used as training tasks.
[Mengzi: A Lightweight yet Powerful Chinese Pre-trained Language Model](www.example.com)
## Usage
```python
from transformers import BertTokenizer, BertModel

# Download the tokenizer and encoder weights from the Hugging Face Hub.
tokenizer = BertTokenizer.from_pretrained("Langboat/mengzi-bert-base")
model = BertModel.from_pretrained("Langboat/mengzi-bert-base")
```
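As a quick sanity check, the checkpoint can also be loaded through the `fill-mask` pipeline. This is a minimal sketch: the example sentence comes from this card's widget metadata, and the printed fields follow the standard `transformers` pipeline output.

```python
from transformers import pipeline

# Load the checkpoint together with its masked-language-modeling head.
fill_mask = pipeline("fill-mask", model="Langboat/mengzi-bert-base")

# Predict the masked token in the widget example sentence
# ("Beijing is the capital of [MASK] country.").
for prediction in fill_mask("北京是[MASK]国的首都。"):
    print(prediction["token_str"], prediction["score"])
```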
## Scores on nine Chinese tasks (without any data augmentation)
|Model|AFQMC|TNEWS|IFLYTEK|CMNLI|WSC|CSL|CMRC|C3|CHID|
|-|-|-|-|-|-|-|-|-|-|
|CLUE RoBERTa-wwm-ext Baseline|74.04|56.94|60.31|80.51|67.80|81.00|75.20|66.50|83.62|
|Mengzi-BERT-base|74.58|57.97|60.68|82.12|87.50|85.40|78.54|71.70|84.16|
## Citation
If you find this technical report or resource useful, please cite the following technical report in your paper.
```
example
```