Joelzhang committed
Commit c95bf02
1 Parent(s): 20041e9

Update README.md

README.md CHANGED
@@ -23,7 +23,7 @@ widget:
 
 2021年登顶FewCLUE和ZeroCLUE的中文BERT,在数个改写任务微调后的版本
 
-This is the fine-tuned version of the Chinese BERT model on several semantic matching and similarity datasets, which topped FewCLUE and ZeroCLUE benchmark in 2021
+This is the fine-tuned version of the Chinese BERT model on several paraphrase datasets, which topped the FewCLUE and ZeroCLUE benchmarks in 2021
 
 ## 模型分类 Model Taxonomy
 
@@ -32,10 +32,10 @@ This is the fine-tuned version of the Chinese BERT model on several semantic mat
 | 通用 General | 自然语言理解 NLU | 二郎神 Erlangshen | MegatronBert | 1.3B | Similarity |
 
 ## 模型信息 Model Information
-We collect 20 paraphrace datasets in the Chinese domain for finetune, with a total of 2773880 samples
+
 基于[Erlangshen-MegatronBert-1.3B](https://huggingface.co/IDEA-CCNL/Erlangshen-MegatronBert-1.3B),我们在收集的20个用于finetune的中文领域的改写数据集,总计227347个样本上微调了一个Similarity版本。
 
-Based on [Erlangshen-MegatronBert-1.3B] (https://huggingface.co/IDEA-CCNL/Erlangshen-MegatronBert-1.3B), we fine-tuned a similarity version on 8 Chinese paraphrace datasets, with totaling 227,347 samples.
+Based on [Erlangshen-MegatronBert-1.3B](https://huggingface.co/IDEA-CCNL/Erlangshen-MegatronBert-1.3B), we fine-tuned a Similarity version on 20 Chinese paraphrase datasets, totaling 227,347 samples.
 
 ### 下游效果 Performance