Quantized using the default exllamav3 (0.0.4) quantization process.
- Original model: DreadPoor/Irix-12B-Model_Stock (refer to it for more details on the model)
- exllamav3: https://github.com/turboderp-org/exllamav3
EXL3 quants available:
- 4bpw, 5bpw
- Go to "Files and versions", then open the branch dropdown (showing "main" by default) to choose your quant
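If you prefer the command line over the web UI, a quant branch can be fetched with `huggingface-cli`. A minimal sketch, assuming the quant branches are named after their bitrates (e.g. `4bpw`), matching the list above:

```shell
# Download the 4bpw quant into a local directory.
# The revision name "4bpw" is an assumption based on the quant list;
# check the branch dropdown under "Files and versions" for the exact names.
huggingface-cli download s1arsky/Irix-12B-Model_Stock-EXL3 \
  --revision 4bpw \
  --local-dir Irix-12B-Model_Stock-EXL3-4bpw
```

Swap `4bpw` for `5bpw` to grab the other quant.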
Model tree for s1arsky/Irix-12B-Model_Stock-EXL3:
- Base model: DreadPoor/Irix-12B-Model_Stock