---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: distilbert-base-uncased-finetuned-lora-cola
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: glue
      type: glue
      args: cola
    metrics:
    - name: Matthews Correlation
      type: matthews_correlation
      value: 0.5049093009936784
---
# distilbert-base-uncased-finetuned-lora-cola

This model is a LoRA fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on the GLUE CoLA dataset.
It achieves the following results on the evaluation set:
- Matthews Correlation: 0.5049
- Trainable model parameters: 1,181,954
- All model parameters: 68,136,964
- Percentage of trainable parameters: 1.73% (see the parameter-count sketch below)
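
The counts above reflect LoRA adapters plus the classification head rather than full fine-tuning. A minimal sketch of how such counts can be reproduced with the `peft` library is shown below; the target modules (`q_lin`, `v_lin`) are an assumption, as this card does not list them.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForSequenceClassification

# Base model with a 2-label classification head (CoLA is a binary acceptability task).
base_model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

# LoRA settings taken from the hyperparameters listed under "Training procedure";
# the target modules are an assumption, not stated in this card.
lora_config = LoraConfig(
    r=32,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_lin", "v_lin"],
    task_type="SEQ_CLS",
)

model = get_peft_model(base_model, lora_config)

# Prints trainable vs. total parameter counts, comparable to the figures above.
model.print_trainable_parameters()
```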
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-04
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- weight_decay: 0.01
- rank: 32
- lora_alpha: 16
- lora_dropout: 0.05
- num_epochs: 5
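
A hedged sketch of a training setup under these settings is given below. It assumes the GLUE CoLA split loaded via `datasets`, the same LoRA configuration as in the earlier sketch, and the standard `Trainer` API; the output directory and target modules are illustrative assumptions, not taken from this card.

```python
import numpy as np
import evaluate
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
raw_datasets = load_dataset("glue", "cola")

def tokenize_fn(batch):
    # CoLA provides a single "sentence" column.
    return tokenizer(batch["sentence"], truncation=True)

tokenized = raw_datasets.map(tokenize_fn, batched=True)

# Wrap the base model with LoRA adapters (rank, alpha, and dropout from the list above;
# the target modules are an assumption).
base_model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
model = get_peft_model(
    base_model,
    LoraConfig(r=32, lora_alpha=16, lora_dropout=0.05,
               target_modules=["q_lin", "v_lin"], task_type="SEQ_CLS"),
)

# Matthews correlation, the metric reported above.
metric = evaluate.load("glue", "cola")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return metric.compute(predictions=predictions, references=labels)

# Hyperparameters as listed above; the output directory is illustrative.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-lora-cola",
    learning_rate=5e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    weight_decay=0.01,
    num_train_epochs=5,
    seed=42,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
trainer.evaluate()
```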