llama-3-8b-dpo-full / train_results.json
{
  "epoch": 0.9969104016477858,
  "total_flos": 0.0,
  "train_loss": 0.6619468080110786,
  "train_runtime": 1288.9507,
  "train_samples": 62135,
  "train_samples_per_second": 48.206,
  "train_steps_per_second": 0.094
}
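The throughput fields in this file are derived quantities, so their internal consistency can be checked directly. A minimal sketch (the dictionary below just restates the JSON above; the inferred global batch size of roughly 512 is an assumption read off the ratio, not stated in the file):

```python
import json

# The train_results.json contents, as shown above.
stats = json.loads("""
{
  "epoch": 0.9969104016477858,
  "total_flos": 0.0,
  "train_loss": 0.6619468080110786,
  "train_runtime": 1288.9507,
  "train_samples": 62135,
  "train_samples_per_second": 48.206,
  "train_steps_per_second": 0.094
}
""")

# Throughput is samples divided by runtime in seconds.
throughput = stats["train_samples"] / stats["train_runtime"]
assert abs(throughput - stats["train_samples_per_second"]) < 0.01

# Samples per optimizer step; ~513 here, which would be consistent
# with a global batch size of 512 (an assumption) plus rounding.
effective_batch = stats["train_samples_per_second"] / stats["train_steps_per_second"]
```

The ratio of `train_samples_per_second` to `train_steps_per_second` gives the number of samples consumed per optimizer step, a quick way to sanity-check the effective batch size without access to the training config.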