# ERNIE-2.0-large

## Introduction

ERNIE 2.0 is a continual pre-training framework proposed by Baidu in 2019,
which incrementally builds and learns pre-training tasks through constant multi-task learning.
Experimental results demonstrate that ERNIE 2.0 outperforms BERT and XLNet on 16 tasks, including the English tasks of the GLUE benchmark and several common Chinese tasks.

More details: https://arxiv.org/abs/1907.12412

## Released Model Info

|Model Name|Language|Model Structure|
|:---:|:---:|:---:|
|ernie-2.0-large-en|English|Layer: 24, Hidden: 1024, Heads: 16|

This released PyTorch model was converted from the officially released PaddlePaddle ERNIE model, and
a series of experiments were conducted to verify the accuracy of the conversion (a quick config check is sketched after the links below).

- Official PaddlePaddle ERNIE repo: https://github.com/PaddlePaddle/ERNIE
- PyTorch conversion repo: https://github.com/nghuyong/ERNIE-Pytorch
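
For reference, the layer count, hidden size, and head count in the table can be compared against the downloaded configuration. This is a minimal sketch, assuming the checkpoint exposes a BERT-style config (attribute names `num_hidden_layers`, `hidden_size`, `num_attention_heads`):

```python
from transformers import AutoConfig

# Load only the configuration (no weights) and compare it to the table above.
config = AutoConfig.from_pretrained("nghuyong/ernie-2.0-large-en")
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)
# Expected: 24 1024 16
```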

## How to use

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-2.0-large-en")
model = AutoModel.from_pretrained("nghuyong/ernie-2.0-large-en")
```
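
As a quick sanity check, the following minimal sketch continues the snippet above; it additionally assumes `torch` is installed and a reasonably recent `transformers` version that returns model outputs as objects:

```python
import torch

# Encode a sample sentence and run a forward pass without tracking gradients.
inputs = tokenizer("ERNIE 2.0 is a continual pre-training framework.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size=1024).
print(outputs.last_hidden_state.shape)
```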

## Citation

```bibtex
@article{sun2019ernie20,
  title={ERNIE 2.0: A Continual Pre-training Framework for Language Understanding},
  author={Sun, Yu and Wang, Shuohuan and Li, Yukun and Feng, Shikun and Tian, Hao and Wu, Hua and Wang, Haifeng},
  journal={arXiv preprint arXiv:1907.12412},
  year={2019}
}
```