---
base_model: distilbert/distilbert-base-uncased
datasets:
  - nyu-mll/multi_nli
language: en
library_name: transformers
license: apache-2.0
metrics:
  - accuracy
pipeline_tag: sentence-similarity
datasets_description:
  - MNLI
model-index:
  - name: distilbert-base-uncased-mnli
    results:
      - task:
          type: natural-language-inference
        dataset:
          name: nyu-mll/multi_nli
          type: nli
          split: validation_matched
        metrics:
          - type: accuracy
            value: 0.8203
---

# Model Card for distilbert-base-uncased-mnli

## Model Details

### Model Description

This model is a fine-tuned version of distilbert/distilbert-base-uncased trained on the nyu-mll/multi_nli dataset for natural language inference. It reaches 82.03% accuracy on the matched validation split of MNLI.

- **Developed by:** Karl Weinmeister
- **Language(s) (NLP):** en
- **License:** apache-2.0
- **Finetuned from model:** distilbert/distilbert-base-uncased
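As a quick illustration of how the model can be used, here is a minimal inference sketch with the `transformers` library. The repository id `kweinmeister/distilbert-mnli` is assumed from this repo's name, and the label names come from whatever `id2label` mapping was exported with the model.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repository id assumed from this repo's name; adjust if the model lives elsewhere.
model_id = "kweinmeister/distilbert-mnli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Encode the premise/hypothesis pair the same way MNLI examples are paired.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label name from the model config.
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```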

## Training Hyperparameters

- **Training regime:** The model was trained for 5 epochs with a batch size of 128 (see the reproduction sketch below).
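The card does not include the training script itself; the following is a minimal sketch of how a comparable run could look with the `transformers` `Trainer`, using only the hyperparameters stated above (5 epochs, batch size 128). Everything else (learning rate, optimizer, output paths) is an assumption or a library default.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

base_model = "distilbert/distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# MNLI is a three-way classification task: entailment, neutral, contradiction.
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=3)

dataset = load_dataset("nyu-mll/multi_nli")

def tokenize(batch):
    # Encode premise/hypothesis pairs, truncating to the model's maximum length.
    return tokenizer(batch["premise"], batch["hypothesis"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)

# Hyperparameters stated in the card: 5 epochs, batch size 128. The rest are defaults/assumptions.
args = TrainingArguments(
    output_dir="distilbert-mnli",
    num_train_epochs=5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation_matched"],
    data_collator=DataCollatorWithPadding(tokenizer),
)
trainer.train()
print(trainer.evaluate())  # reports eval loss; accuracy needs a compute_metrics function
```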