suolyer committed 73dd8a4 (1 parent: 2773647)

Update README.md
---
language:
- zh
license: apache-2.0

tags:
- bert
- NLU
- NLI

inference: true

widget:
- text: "今天心情不好[SEP]今天很开心"
---
# Erlangshen-MegatronBert-1.3B-NLI, a Chinese NLI model, part of [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM)

We collected four Chinese NLI (Natural Language Inference) datasets, 1,014,787 samples in total, for fine-tuning. Our model is mainly based on [roberta](https://huggingface.co/hfl/chinese-roberta-wwm-ext).

## Usage

```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('IDEA-CCNL/Erlangshen-MegatronBert-1.3B-NLI')
model = BertForSequenceClassification.from_pretrained('IDEA-CCNL/Erlangshen-MegatronBert-1.3B-NLI')

texta = '今天的饭不好吃'  # premise
textb = '今天心情不好'    # hypothesis

# Encode the sentence pair and run a forward pass
output = model(torch.tensor([tokenizer.encode(texta, textb)]))

# Class probabilities over the NLI labels
print(torch.nn.functional.softmax(output.logits, dim=-1))
```
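The softmax above yields one probability per NLI class. A minimal sketch of turning those probabilities into a readable prediction, using hypothetical logits in place of `output.logits` and an assumed three-way label order (check `model.config.id2label` for the actual mapping of this checkpoint):

```python
import torch

# Hypothetical logits, standing in for output.logits from the snippet above
logits = torch.tensor([[0.2, 2.5, -1.0]])

probs = torch.nn.functional.softmax(logits, dim=-1)

# Assumed label order for illustration only; the real mapping
# is stored in model.config.id2label
id2label = {0: 'entailment', 1: 'neutral', 2: 'contradiction'}

pred = probs.argmax(dim=-1).item()
print(id2label[pred], round(probs[0, pred].item(), 4))
```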
## Scores on downstream Chinese tasks (without any data augmentation)

| Model | cmnli | ocnli | snli |
| :--------: | :-----: | :----: | :-----: |
| Erlangshen-Roberta-110M-NLI | 80.83 | 78.56 | 88.01 |
| Erlangshen-Roberta-330M-NLI | 82.25 | 79.82 | 88.00 |
| Erlangshen-MegatronBert-1.3B-NLI | 84.52 | 84.17 | 88.67 |

## Citation

If you find this resource useful, please cite the following website in your paper.

```
@misc{Fengshenbang-LM,
  title={Fengshenbang-LM},
  author={IDEA-CCNL},
  year={2021},
  howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
```