---
license: llama2
---
|
|
|
# LoRA finetuning on Wikipedia-10, applying counterfactual data augmentation (CDA)
|
- Dataset: Wikipedia-10

- Target modules = ["q_proj", "k_proj", "v_proj", "o_proj", "gate_proj", "up_proj", "down_proj"] (see the config sketch below)
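
This card does not record the LoRA hyperparameters themselves, but a minimal PEFT sketch over the target modules above might look like the following. The base checkpoint, rank, alpha, and dropout are illustrative assumptions, not values taken from this run:

```python
# Minimal sketch of a LoRA setup over the listed target modules.
# The base checkpoint, r, lora_alpha, and lora_dropout are assumptions;
# they are not recorded in this card.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf"  # assumed base model (card license: llama2)
)

lora_config = LoraConfig(
    r=16,               # assumed rank
    lora_alpha=32,      # assumed scaling
    lora_dropout=0.05,  # assumed dropout
    task_type="CAUSAL_LM",
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapters are trainable
```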
|
|
|
Training stats:

```json
{
  "epoch": 2.566261678495529,
  "total_flos": 2.605973017460736e+18,
  "train_loss": 0.7921548962593079,
  "train_runtime": 150925.8962,
  "train_samples": 24939,
  "train_samples_per_second": 0.424,
  "train_steps_per_second": 0.013
}
```
|
|
|
|
|
# Training script: https://github.com/ao9000/bias-bench/blob/main/experiments/run_clm.py
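
CDA itself is a preprocessing step: each training sentence containing a bias-attribute word (e.g. a gendered term) is paired with a counterfactual copy in which those words are swapped. Below is a minimal sketch of two-sided gender CDA, assuming a small hand-written word-pair list; the actual pair list and preprocessing in bias-bench may differ:

```python
# Minimal sketch of two-sided counterfactual data augmentation (CDA).
# The pair list is illustrative; the real bias-bench list is larger and
# handles ambiguous cases (e.g. possessive "her") that this sketch does not.
import re

GENDER_PAIRS = [("he", "she"), ("him", "her"), ("his", "hers"),
                ("man", "woman"), ("men", "women")]

# Map each word to its counterpart in both directions.
SWAP = {w: c for a, b in GENDER_PAIRS for w, c in ((a, b), (b, a))}

PATTERN = re.compile(r"\b(" + "|".join(map(re.escape, SWAP)) + r")\b",
                     re.IGNORECASE)

def counterfactual(text: str) -> str:
    """Swap each gendered word for its counterpart, preserving capitalization."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = SWAP[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    return PATTERN.sub(repl, text)

# Two-sided CDA: keep the original sentence and add its counterfactual.
corpus = ["He is a doctor and she is a nurse."]
augmented = [s for text in corpus for s in (text, counterfactual(text))]
print(augmented)
# ['He is a doctor and she is a nurse.', 'She is a doctor and he is a nurse.']
```

Two-sided augmentation keeps the original sentence alongside its counterfactual, so the model sees both demographic variants with equal frequency during finetuning.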
|
|
|
|