---
base_model: distilbert/distilbert-base-uncased
datasets:
- nyu-mll/multi_nli
language: en
library_name: transformers
license: apache-2.0
metrics:
- accuracy
pipeline_tag: text-classification
model-index:
- name: distilbert-base-uncased-mnli
  results:
  - task:
      type: natural-language-inference
    dataset:
      name: nyu-mll/multi_nli
      type: nli
      split: validation_matched
    metrics:
    - type: accuracy
      value: 0.8203
---
|
|
|
# Model Card for distilbert-base-uncased-mnli
|
|
|
A DistilBERT model fine-tuned for natural language inference (NLI) on the MultiNLI dataset.
|
|
|
|
|
|
|
## Model Details |
|
|
|
### Model Description |
|
|
|
|
|
|
A fine-tuned version of `distilbert/distilbert-base-uncased` for natural language inference, trained on the `nyu-mll/multi_nli` dataset. It achieves 82.03% accuracy on the MultiNLI matched validation split.
|
|
|
- **Developed by:** Karl Weinmeister |
|
- **Language(s) (NLP):** en |
|
- **License:** apache-2.0 |
|
- **Finetuned from model:** distilbert/distilbert-base-uncased
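As a sketch of how the model could be used for inference with the `transformers` pipeline API. Note that the Hub id `kweinmeister/distilbert-base-uncased-mnli` is an assumption based on the model name and developer listed above; substitute the actual repository id if it differs:

```python
# Sketch: NLI inference with the fine-tuned model via the transformers pipeline.
# The model id below is an assumed Hub id, not confirmed by this card.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="kweinmeister/distilbert-base-uncased-mnli",
)

# MultiNLI-style input: a premise/hypothesis sentence pair.
result = classifier(
    {
        "text": "A soccer game with multiple males playing.",
        "text_pair": "Some men are playing a sport.",
    }
)
print(result)
```

The pipeline returns a list with one dict per input, containing the predicted label and its softmax score.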
|
|
|
|
|
|
|
#### Training Hyperparameters |
|
|
|
- **Training regime:** The model was trained for 5 epochs with a batch size of 128.
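The stated regime could be expressed as a `TrainingArguments` configuration along these lines. Only the epoch count and batch size come from this card; the output directory and learning rate are illustrative assumptions:

```python
# Config sketch of the stated training regime: 5 epochs, batch size 128.
# num_train_epochs and per_device_train_batch_size come from this card;
# output_dir and learning_rate are assumed placeholder values.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-mnli",  # assumed
    num_train_epochs=5,                         # from this card
    per_device_train_batch_size=128,            # from this card
    learning_rate=5e-5,                         # assumed
)
```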
|
|