llama3-8b-ultrachat-sft-itt / train_results.json
Commit b2f2e5b (verified) by kykim0: Model save
{
  "epoch": 1.9989310529128808,
  "total_flos": 783053739786240.0,
  "train_loss": 1.0084666737260666,
  "train_runtime": 48199.6175,
  "train_samples": 207864,
  "train_samples_per_second": 4.968,
  "train_steps_per_second": 0.039
}
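A few run-level quantities can be derived from these metrics. For example, dividing `train_samples_per_second` by `train_steps_per_second` gives the effective batch size (per-device batch × gradient accumulation × number of devices), and `train_runtime` is in seconds. The sketch below is illustrative only; the field names follow the Hugging Face `Trainer` output above, and the derived effective batch size is an approximation since the reported rates are rounded.

```python
import json

# The train_results.json content shown above, embedded verbatim.
raw = """
{
  "epoch": 1.9989310529128808,
  "total_flos": 783053739786240.0,
  "train_loss": 1.0084666737260666,
  "train_runtime": 48199.6175,
  "train_samples": 207864,
  "train_samples_per_second": 4.968,
  "train_steps_per_second": 0.039
}
"""
results = json.loads(raw)

# Wall-clock time in hours (train_runtime is reported in seconds).
runtime_hours = results["train_runtime"] / 3600

# Effective batch size = samples per second / steps per second.
# Approximate, because both reported rates are rounded.
eff_batch = results["train_samples_per_second"] / results["train_steps_per_second"]

print(f"runtime: {runtime_hours:.2f} h")
print(f"effective batch size: ~{eff_batch:.0f}")
```

With the values above this gives roughly 13.4 hours of training and an effective batch size near 128 (the quotient is about 127 due to rounding in the reported rates).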