---
license: apache-2.0
datasets:
- xnli
- multi_nli
language:
- en
pipeline_tag: zero-shot-classification
tags:
- roberta
---
|
# DistilRoBERTa for NLI |
|
|
|
## Model description |
|
This model can be used for Natural Language Inference (NLI) tasks.

It is a version of [distilroberta-base](https://huggingface.co/distilroberta-base) fine-tuned on [multi_nli](https://huggingface.co/datasets/multi_nli)

and the English portion of [xnli](https://huggingface.co/datasets/xnli).
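A minimal usage sketch with the 🤗 Transformers `zero-shot-classification` pipeline. Note that the model id below is a placeholder (the base model rather than this fine-tuned checkpoint), since this card does not state the final repository id:

```python
from transformers import pipeline

# NOTE: "distilroberta-base" is a placeholder; substitute the id of this
# fine-tuned NLI checkpoint once it is published.
classifier = pipeline("zero-shot-classification", model="distilroberta-base")

result = classifier(
    "The new movie was a breathtaking experience",
    candidate_labels=["positive", "negative", "neutral"],
)

# The pipeline returns candidate labels ranked by entailment probability.
print(result["labels"][0], result["scores"][0])
```

Under the hood, the pipeline poses each candidate label as an NLI hypothesis ("This example is {label}.") and ranks labels by the model's entailment score.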
|
|
|
## Model Performance |
|
The model's performance on NLI tasks is as follows: |
|
|
|
- Accuracy on MNLI validation matched: TODO |
|
- Accuracy on MNLI validation mismatched: TODO |