LoRA fine-tuning on Wikipedia-10, with increased dropout applied to all target layers
- Dataset: Wikipedia-10
- Target modules = ["q_proj", "k_proj", "v_proj", "o_proj", "gate_proj", "up_proj", "down_proj"] (see the configuration sketch below)
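The card does not state the base model, the LoRA rank, or the exact dropout value. The following is a minimal PEFT configuration sketch consistent with the target modules listed above; the base checkpoint (a LLaMA-family model is assumed, suggested by the q_proj/gate_proj module names), rank, alpha, and dropout are illustrative assumptions, not the card's actual settings.

```python
# Minimal LoRA setup sketch matching the target modules above.
# Base model, r, lora_alpha, and lora_dropout are ASSUMPTIONS --
# the card only says dropout was increased for all layers.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf"  # hypothetical base; not named on the card
)

lora_config = LoraConfig(
    task_type="CAUSAL_LM",
    r=16,              # assumed rank
    lora_alpha=32,     # assumed scaling
    lora_dropout=0.2,  # "increased dropout"; exact value is an assumption
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # adapters on attention + MLP projections
```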
Training metrics:

```json
{
  "epoch": 0.1857053320643469,
  "total_flos": 2.605973017460736e+18,
  "train_loss": 1.8399735396504402,
  "train_runtime": 135360.4815,
  "train_samples": 344632,
  "train_samples_per_second": 0.473,
  "train_steps_per_second": 0.015
}
```
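As a quick consistency check on the metrics above (all figures are taken directly from the JSON; only the derived quantities are computed here), the reported throughput follows from epoch fraction × dataset size ÷ runtime:

```python
# Sanity-check the reported trainer metrics.
epoch = 0.1857053320643469
train_samples = 344632
train_runtime_s = 135360.4815
steps_per_second = 0.015

samples_seen = epoch * train_samples         # ~64,000 samples
throughput = samples_seen / train_runtime_s  # samples per second

print(f"{samples_seen:.0f} samples in {train_runtime_s / 3600:.1f} h "
      f"-> {throughput:.3f} samples/s")      # matches train_samples_per_second = 0.473

# Implied effective batch size (an inference, not stated on the card):
# ~0.473 / 0.015 = ~32 samples per optimizer step.
print(f"~{throughput / steps_per_second:.0f} samples per step")
```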
Training script: https://github.com/ao9000/bias-bench/blob/main/experiments/run_clm.py
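To try the adapter, a typical PEFT loading pattern looks like the sketch below. Both the base-model id and the adapter repo id are placeholders, since neither is named explicitly in this section.

```python
# Sketch of loading this LoRA adapter for inference with PEFT.
# base_id and adapter_id are HYPOTHETICAL -- substitute the real ones.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-hf"           # hypothetical base model
adapter_id = "your-username/wikipedia10-lora"  # hypothetical adapter repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(model, adapter_id)  # attach LoRA weights

inputs = tokenizer("Wikipedia is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```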