sijunhe committed
Commit 83e3d3d
1 Parent(s): 317fefa

Update README.md

Files changed (1)
  1. README.md +37 -1
README.md CHANGED
@@ -1,4 +1,40 @@
---
library_name: paddlenlp
+ license: apache-2.0
+ language:
+ - en
---
- # PaddlePaddle/ernie-2.0-base-en
+ # PaddlePaddle/ernie-2.0-base-en
+
+ ## Introduction
+
+ Recently, pre-trained models have achieved state-of-the-art results in various language understanding tasks, which indicates that pre-training on large-scale corpora may play a crucial role in natural language processing.
+ Current pre-training procedures usually focus on training the model with several simple tasks to grasp the co-occurrence of words or sentences. However, besides co-occurring information,
+ there is other valuable lexical, syntactic and semantic information in training corpora, such as named entities, semantic closeness and discourse relations.
+ In order to extract the lexical, syntactic and semantic information from training corpora to the fullest extent, we propose a continual pre-training framework named ERNIE 2.0,
+ which incrementally builds and learns pre-training tasks through constant multi-task learning.
+ Experimental results demonstrate that ERNIE 2.0 outperforms BERT and XLNet on 16 tasks, including the English tasks of the GLUE benchmark and several common tasks in Chinese.
+
+ More details: https://arxiv.org/abs/1907.12412
+
+ ## Available Models
+
+ - ernie-2.0-base-en
+ - ernie-2.0-large-en
+ - ernie-2.0-base-zh
+ - ernie-2.0-large-zh
+
+ ## How to Use?
+
+ Click on the *Use in paddlenlp* button on the top right!
+
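As a reference, a minimal usage sketch in Python, assuming a working `paddlepaddle` and `paddlenlp` installation (exact return types can vary across PaddleNLP versions):

```python
import paddle
from paddlenlp.transformers import ErnieModel, ErnieTokenizer

# Load the tokenizer and pre-trained weights by checkpoint name.
tokenizer = ErnieTokenizer.from_pretrained("ernie-2.0-base-en")
model = ErnieModel.from_pretrained("ernie-2.0-base-en")

# Tokenize a sentence and add a batch dimension.
inputs = tokenizer("ERNIE 2.0 is a continual pre-training framework.")
inputs = {name: paddle.to_tensor([ids]) for name, ids in inputs.items()}

# sequence_output holds per-token hidden states; pooled_output is the [CLS] representation.
sequence_output, pooled_output = model(**inputs)
print(sequence_output.shape, pooled_output.shape)
```

Any of the other checkpoints listed under *Available Models* can be swapped in for `ernie-2.0-base-en`.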
+ ## Citation Info
+
+ ```text
+ @article{ernie2.0,
+   title   = {ERNIE 2.0: A Continual Pre-training Framework for Language Understanding},
+   author  = {Sun, Yu and Wang, Shuohuan and Li, Yukun and Feng, Shikun and Tian, Hao and Wu, Hua and Wang, Haifeng},
+   journal = {arXiv preprint arXiv:1907.12412},
+   year    = {2019},
+ }
+ ```