---
license: apache-2.0
datasets:
  - xnli
  - multi_nli
language:
  - en
pipeline_tag: zero-shot-classification
tags:
  - roberta
---

# DistilRoBERTa for NLI

## Model description

This model can be used for Natural Language Inference (NLI) tasks. It is a version of distilroberta-base fine-tuned on multi_nli and on the English subset of xnli.
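Because the model card declares the `zero-shot-classification` pipeline tag, it can be loaded through the standard `transformers` pipeline API. A minimal sketch follows; the model id `matekadlicsko/distilroberta-nli` is an assumption based on this repository's owner and name, so substitute the actual Hub id if it differs.

```python
# Sketch of zero-shot classification with this NLI model.
# NOTE: the model id below is assumed from the repo name, not confirmed.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="matekadlicsko/distilroberta-nli",
)

result = classifier(
    "The new GPU cut our training time in half.",
    candidate_labels=["technology", "sports", "politics"],
)
# result is a dict with "sequence", "labels" (sorted by score), and "scores".
print(result["labels"][0], round(result["scores"][0], 3))
```

Under the hood, the pipeline frames each candidate label as an NLI hypothesis ("This example is about {label}.") and ranks labels by the entailment probability the model assigns.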

## Model Performance

The model's performance on NLI tasks is as follows:

- Accuracy on MNLI validation matched: TODO
- Accuracy on MNLI validation mismatched: TODO