devolvedLLM / mixtral2 / all_results.json
{
"epoch": 79.43793911007026,
"total_flos": 1.1806200422774342e+19,
"train_loss": 0.4446117949373317,
"train_runtime": 54362.4918,
"train_samples_per_second": 6.279,
"train_steps_per_second": 0.078
}
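The JSON above holds the aggregate metrics written at the end of a training run (final loss, runtime, throughput, total FLOPs). A minimal sketch, assuming the file has been downloaded locally as all_results.json, of loading and summarizing it with Python's standard json module:

import json

# Path is an assumption; point it at your local copy of the file shown above.
with open("all_results.json") as f:
    results = json.load(f)

# Keys below are taken directly from the JSON contents above.
print(f"final train loss : {results['train_loss']:.4f}")
print(f"epochs completed : {results['epoch']:.2f}")
print(f"runtime (hours)  : {results['train_runtime'] / 3600:.2f}")
print(f"samples / second : {results['train_samples_per_second']}")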