---
language:
- en
tags:
- text-classification
- zero-shot-classification
pipeline_tag: zero-shot-classification
library_name: transformers
license: mit
---

# Model description: deberta-v3-large-mnli-fever-anli-ling-wanli-binary

This model was created primarily as a comparative benchmark for another model; see https://huggingface.co/MoritzLaurer/deberta-v3-large-zeroshot-v1.1-all-33

This model was trained only on five NLI datasets, while the other model was trained on many more datasets.
For most use cases I recommend the other model.
This NLI-only model may be preferable only for tasks that are not zero-shot classification
but that adhere more strictly to the original NLI task (a premise/hypothesis sketch is shown below).
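
A minimal sketch of direct NLI inference with this model. The model id below is assumed from the heading and the author's namespace, and the binary label set (entailment vs. not-entailment) is inferred from the "-binary" suffix; check the repository's `config.json` for the exact id and labels before relying on this.

```python
# Hedged sketch: direct premise/hypothesis NLI inference with transformers.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed model id; verify against the actual repository name.
model_id = "MoritzLaurer/deberta-v3-large-mnli-fever-anli-ling-wanli-binary"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A man is playing a guitar on stage."
hypothesis = "Someone is performing music."

inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map probabilities to whatever labels the checkpoint defines (e.g. entailment / not_entailment).
probs = torch.softmax(logits, dim=-1)[0]
labels = [model.config.id2label[i] for i in range(probs.shape[0])]
print(dict(zip(labels, probs.tolist())))
```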

See the other model's model card for usage instructions, training data and the paper.
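
For completeness, here is a minimal zero-shot classification sketch based on the `pipeline_tag` declared above. Again, the model id is an assumption inferred from the heading and the author's other model URL; the other model's card remains the authoritative usage reference.

```python
# Hedged sketch: zero-shot classification via the transformers pipeline.
from transformers import pipeline

# Assumed model id; verify against the actual repository name.
model_id = "MoritzLaurer/deberta-v3-large-mnli-fever-anli-ling-wanli-binary"
classifier = pipeline("zero-shot-classification", model=model_id)

text = "The new GPU delivers twice the throughput of its predecessor."
candidate_labels = ["technology", "sports", "politics"]
print(classifier(text, candidate_labels))
```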