| Commit message | Hash | Verified | Author | Date |
|---|---|---|---|---|
| Add llama_finetune_MMLU-Pro_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=100_seed=123 LoRA model | 1293485 | yes | mciccone | Jun 10 |
| Add llama_finetune_MMLU-Pro_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=500_seed=123 LoRA model | 8bd2c12 | yes | mciccone | Jun 10 |
| Add llama_finetune_MMLU-Pro_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=500_seed=123 LoRA model | e2188e1 | yes | mciccone | Jun 10 |
| Add llama_finetune_MMLU-Pro_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=100_seed=123 LoRA model | a21dadd | yes | mciccone | Jun 10 |
| Add llama_finetune_MMLU-Pro_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=100_seed=123 LoRA model | 598a7bc | yes | mciccone | Jun 10 |
| Add llama_finetune_MMLU-Pro_r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123 LoRA model | 581607e | yes | mciccone | Jun 10 |
| Add llama_finetune_MMLU-Pro_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=500_seed=123 LoRA model | bc40097 | yes | mciccone | Jun 10 |
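Each checkpoint name encodes the hyperparameters of one fine-tuning run: LoRA rank r=16, scaling alpha=32, dropout 0.05, the learning rate, a 1000-example MMLU-Pro training subset, the step budget, and the random seed. Below is a minimal sketch of the setup such a name implies, assuming the standard `peft`/`transformers` APIs; the base model ID and task type are illustrative assumptions, since the names only say "llama".

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, TrainingArguments

# Example run, matching the first commit in the table above.
run_name = ("llama_finetune_MMLU-Pro_r16_alpha=32_dropout=0.05"
            "_lr0.0002_data_size1000_max_steps=100_seed=123")

# LoRA hyperparameters taken directly from the checkpoint name.
lora_config = LoraConfig(
    r=16,                    # rank of the low-rank update matrices
    lora_alpha=32,           # scaling factor applied to the update
    lora_dropout=0.05,
    task_type="CAUSAL_LM",   # assumption: causal LM fine-tuning
)

# Assumption: a Llama checkpoint; the names do not pin down which one.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
model = get_peft_model(base_model, lora_config)

# Optimizer settings from the name: lr0.0002, max_steps=100, seed=123.
# data_size1000 would correspond to sampling 1000 MMLU-Pro examples.
training_args = TrainingArguments(
    output_dir=run_name,
    learning_rate=2e-4,
    max_steps=100,
    seed=123,
)
```

Across the seven commits only two fields vary: the learning rate (5e-05, 0.0001, 0.0002, 0.0003) and the step budget (100 or 500). Rank, alpha, dropout, data size, and seed are held fixed, so the batch of uploads reads as a small learning-rate/step-count grid search.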