Add llama_finetune_boolean_expressions_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=500_seed=123 LoRA model (commit ff019e2, verified; mciccone, committed on Jun 10, 2025)
Add llama_finetune_boolean_expressions_r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123 LoRA model (commit 2da2ebc, verified; mciccone, committed on Jun 10, 2025)
Add llama_finetune_boolean_expressions_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=100_seed=123 LoRA model (commit bb8daa8, verified; mciccone, committed on Jun 10, 2025)
Add llama_finetune_boolean_expressions_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=100_seed=123 LoRA model (commit fc7e9c7, verified; mciccone, committed on Jun 10, 2025)
Add llama_finetune_boolean_expressions_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=500_seed=123 LoRA model (commit 566e06e, verified; mciccone, committed on Jun 10, 2025)
Add llama_finetune_boolean_expressions_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=100_seed=123 LoRA model (commit 523475a, verified; mciccone, committed on Jun 10, 2025)
Add llama_finetune_boolean_expressions_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=500_seed=123 LoRA model (commit 029f427, verified; mciccone, committed on Jun 10, 2025)
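Each adapter name above encodes its LoRA training configuration (rank, alpha, dropout, learning rate, dataset size, max steps, seed). A minimal sketch of a parser for this naming scheme, assuming the exact format shown in the commits above (`parse_adapter_name` is a hypothetical helper, not part of any repository here):

```python
import re

# Pattern matching the hyperparameter suffix of the adapter names listed
# above, e.g. "..._r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123".
NAME_RE = re.compile(
    r"r(?P<r>\d+)_alpha=(?P<alpha>[\d.]+)_dropout=(?P<dropout>[\d.]+)"
    r"_lr(?P<lr>[\d.e+-]+)_data_size(?P<data_size>\d+)"
    r"_max_steps=(?P<max_steps>\d+)_seed=(?P<seed>\d+)"
)

def parse_adapter_name(name: str) -> dict:
    """Extract the LoRA training hyperparameters encoded in an adapter name."""
    m = NAME_RE.search(name)
    if m is None:
        raise ValueError(f"unrecognized adapter name: {name}")
    d = m.groupdict()
    return {
        "r": int(d["r"]),                    # LoRA rank
        "alpha": float(d["alpha"]),          # LoRA scaling alpha
        "dropout": float(d["dropout"]),      # LoRA dropout
        "lr": float(d["lr"]),                # learning rate (handles "5e-05")
        "data_size": int(d["data_size"]),    # training set size
        "max_steps": int(d["max_steps"]),    # training steps
        "seed": int(d["seed"]),              # RNG seed
    }
```

For example, `parse_adapter_name("llama_finetune_boolean_expressions_r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123")` yields `r=16`, `alpha=32.0`, `lr=5e-05`, `max_steps=500`, which makes it easy to group the adapters in this history by learning rate or step count.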