Add llama_finetune_qqp_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=100_seed=123 LoRA model e96ed16 verified mciccone committed on Jun 10, 2025
Add llama_finetune_qqp_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=500_seed=123 LoRA model f4c89b6 verified mciccone committed on Jun 10, 2025
Add llama_finetune_qqp_r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123 LoRA model 774d3be verified mciccone committed on Jun 10, 2025
Add llama_finetune_qqp_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=500_seed=123 LoRA model 1c430f3 verified mciccone committed on Jun 10, 2025
Add llama_finetune_qqp_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=500_seed=123 LoRA model a8c9ffb verified mciccone committed on Jun 10, 2025
Add llama_finetune_qqp_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=100_seed=123 LoRA model b37720d verified mciccone committed on Jun 10, 2025
Add llama_finetune_qqp_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=100_seed=123 LoRA model 0d09f5d verified mciccone committed on Jun 10, 2025
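The run names above follow a regular pattern encoding the LoRA hyperparameters (rank `r`, `alpha`, `dropout`, learning rate `lr`, `data_size`, `max_steps`, `seed`). A minimal sketch of how such names could be parsed back into structured fields is shown below; the field names and the regex are inferred from the naming pattern in this commit log, not an official schema of this repository.

```python
import re

def parse_run_name(name: str) -> dict:
    """Parse a run name like those in the commit log above into its
    hyperparameters. Field names are an assumption inferred from the
    observed naming pattern."""
    pattern = (
        r"(?P<model>[a-z]+)_finetune_(?P<task>[a-z]+)"
        r"_r(?P<r>\d+)"
        r"_alpha=(?P<alpha>\d+)"
        r"_dropout=(?P<dropout>[\d.]+)"
        r"_lr(?P<lr>[\d.e-]+)"          # handles both 0.0001 and 5e-05
        r"_data_size(?P<data_size>\d+)"
        r"_max_steps=(?P<max_steps>\d+)"
        r"_seed=(?P<seed>\d+)"
    )
    m = re.fullmatch(pattern, name)
    if m is None:
        raise ValueError(f"unrecognized run name: {name}")
    fields = m.groupdict()
    # Cast numeric fields to the appropriate types.
    for key in ("r", "alpha", "data_size", "max_steps", "seed"):
        fields[key] = int(fields[key])
    for key in ("dropout", "lr"):
        fields[key] = float(fields[key])
    return fields

cfg = parse_run_name(
    "llama_finetune_qqp_r16_alpha=32_dropout=0.05_lr5e-05"
    "_data_size1000_max_steps=500_seed=123"
)
```

Parsing names this way makes it easy to sweep over the checkpoints programmatically, e.g. to select all runs with `max_steps == 500` or to group adapters by learning rate.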