Training hyperparameters:

warmup_steps = 5,
num_train_epochs = 3,
learning_rate = 5e-5,

optim = "galore_adafactor",
optim_target_modules = [r".*.attn.*", r".*.mlp.*"],

weight_decay = 0.03,  # L2 regularization
lr_scheduler_type = "linear",  # alternative: "reduce_lr_on_plateau"
gradient_accumulation_steps = 4,
use_liger = True,
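For context, a minimal sketch assembling the values above into a `transformers` `TrainingArguments` object. The `output_dir` is an assumption, and the fragment's `use_liger` is rendered here as `use_liger_kernel`, the corresponding flag in recent `transformers` releases; GaLore optimizers (`optim="galore_adafactor"` with `optim_target_modules`) require the `galore-torch` package to be installed.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="Llama8B-1k-3-epoch_3",  # assumed; not stated in the card
    warmup_steps=5,
    num_train_epochs=3,
    learning_rate=5e-5,
    # GaLore Adafactor, projecting gradients for attention and MLP modules
    optim="galore_adafactor",
    optim_target_modules=[r".*.attn.*", r".*.mlp.*"],
    weight_decay=0.03,  # L2 regularization
    lr_scheduler_type="linear",
    gradient_accumulation_steps=4,
    use_liger_kernel=True,  # assumed mapping of the fragment's `use_liger`
)
```

The resulting `args` would then be passed to a `Trainer` (or TRL `SFTTrainer`) along with the model and dataset.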
Collection including moneco/Llama8B-1k-3-epoch_3