An initial foray into the world of fine-tuning. The goal of this release was to improve the quality of the original model's responses, in particular for vision use cases.*
Weighted (Importance Matrix) quants available here
Static (Legacy) quants available here
*Requires an additional mmproj file. Two options for vision functionality are available inside this repo.
Select the GGUF file of your choice in Koboldcpp as usual, then make sure to choose the mmproj file from this repo in the LLaVA mmproj field of the model submenu.
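The mmproj pairing is not specific to Koboldcpp; any llama.cpp-based loader that accepts a multimodal projector should also work. As one illustration, here is a minimal sketch using llama-cpp-python's LLaVA chat handler. The filenames and context size are assumptions rather than the exact files shipped in this repo, and this is an alternative to the Koboldcpp workflow above, not the recommended path from this card.

```python
# Sketch only: loading a GGUF quant plus its mmproj file with llama-cpp-python
# instead of Koboldcpp. Filenames below are placeholders; substitute the quant
# and mmproj files you actually downloaded from this repo.
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# The mmproj file supplies the vision projector for the LLaVA-style pipeline.
chat_handler = Llava15ChatHandler(clip_model_path="mmproj-model-f16.gguf")

llm = Llama(
    model_path="Excalibur-7b-Q4_K_M.gguf",  # placeholder quant filename
    chat_handler=chat_handler,
    n_ctx=4096,  # leave room for image tokens alongside the prompt
)

# Ask a question about a local image.
response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "file:///path/to/image.png"}},
                {"type": "text", "text": "Describe this image."},
            ],
        }
    ]
)
print(response["choices"][0]["message"]["content"])
```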

Detailed results can be found here
| Metric | Value | 
|---|---|
| Avg. | 73.84 | 
| AI2 Reasoning Challenge (25-shot) | 70.90 |
| HellaSwag (10-shot) | 87.93 |
| MMLU (5-shot) | 65.46 |
| TruthfulQA (0-shot) | 70.82 |
| Winogrande (5-shot) | 82.48 |
| GSM8k (5-shot) | 65.43 |
Base model: InferenceIllusionist/Excalibur-7b