---
language: en
tags: 
  - deberta
  - deberta-v3
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
---

## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. With these two improvements, DeBERTa outperforms RoBERTa on a majority of NLU tasks with 80GB of training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

In DeBERTa V3, we replaced the MLM objective with the RTD (Replaced Token Detection) objective during pre-training, which significantly improves model performance. Please check appendix A11 of our paper [DeBERTa](https://arxiv.org/abs/2006.03654) for more details.
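
As a rough illustration only (not the actual DeBERTa V3 pre-training code), RTD turns pre-training into per-token binary classification: a small generator proposes replacements for masked positions, and the main model predicts which positions hold replaced tokens.

```python
# Conceptual sketch of the RTD objective; sequences and the replacement are
# made-up examples, not real pre-training data.
original  = ["the", "cat", "sat", "on", "the", "mat"]
corrupted = ["the", "dog", "sat", "on", "the", "mat"]  # generator replaced "cat"

# RTD labels: 1 where the token was replaced, 0 where it is the original token.
rtd_labels = [int(o != c) for o, c in zip(original, corrupted)]
print(rtd_labels)  # [0, 1, 0, 0, 0, 0]
```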

This is the DeBERTa V3 small model with 6 layers and a hidden size of 768. It has 143M parameters in total, of which the embedding layer accounts for about 98M due to the 128k vocabulary. It was trained on 160GB of data.
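
A minimal usage sketch with the Hugging Face `transformers` library is shown below; the model id `microsoft/deberta-v3-small` is assumed from this card, and the snippet only extracts encoder hidden states.

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-small")
model = AutoModel.from_pretrained("microsoft/deberta-v3-small")

inputs = tokenizer("DeBERTa uses disentangled attention.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, 768) per the hidden size above
```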

#### Fine-tuning on NLU tasks

We present dev set results on the SQuAD 1.1/2.0 and MNLI tasks.

| Model                  | SQuAD 1.1 (F1/EM) | SQuAD 2.0 (F1/EM) | MNLI-m (Acc) |
|------------------------|-------------------|-------------------|--------------|
| RoBERTa-base           | 91.5/84.6         | 83.7/80.5         | 87.6         |
| XLNet-base             | -/-               | -/80.2            | 86.8         |
| DeBERTa-base           | 93.1/87.2         | 86.2/83.1         | 88.8         |
| **DeBERTa-v3-small**   | -/-               | -/-               | 88.2         |
| DeBERTa-v3-small+SiFT  | -/-               | -/-               | 88.8         |
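
For reference, here is a hedged fine-tuning sketch on MNLI using the Hugging Face `Trainer`. The hyperparameters are illustrative assumptions, not the settings used to produce the numbers above.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

model_name = "microsoft/deberta-v3-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

# MNLI from the GLUE benchmark: premise/hypothesis pairs with 3 labels.
mnli = load_dataset("glue", "mnli")

def tokenize(batch):
    return tokenizer(batch["premise"], batch["hypothesis"], truncation=True, max_length=256)

encoded = mnli.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="deberta-v3-small-mnli",
    learning_rate=2e-5,                 # illustrative; tune per task
    per_device_train_batch_size=32,
    num_train_epochs=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation_matched"],
    data_collator=DataCollatorWithPadding(tokenizer),
)
trainer.train()
# Reports eval loss only; pass a compute_metrics function to report accuracy.
print(trainer.evaluate())
```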


### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```bibtex
@inproceedings{he2021deberta,
  title={{DeBERTa}: Decoding-enhanced {BERT} with Disentangled Attention},
  author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=XPZIaotutsD}
}
```