ExLlamaV3 quantizations of Qwen/Qwen3-235B-A22B-Thinking-2507
| bpw  | head bits | size        |
|------|-----------|-------------|
| 2.10 | h6        | 59.287 GiB  |
| 2.80 | h6        | 78.295 GiB  |
| 3.60 | h6        | 100.116 GiB |
| 4.25 | h6        | 117.803 GiB |
- The 2.10 bpw quant will fit in three 24 GB cards with 45k tokens of context.
- The 2.80 bpw quant will fit in four 24 GB cards with 57k tokens of context.
- The 3.60 bpw quant will fit in five 24 GB cards with 57k tokens of context.
- The 4.25 bpw quant will fit in six 24 GB cards with 73k tokens of context.
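The listed sizes can be sanity-checked with simple arithmetic: a quantized model occupies roughly total parameters times bits-per-weight divided by 8 bytes. This is a rough sketch; the 235e9 parameter count is an assumption read off the model name, and the small gap versus the listed sizes comes from the h6 output head and tensor metadata.

```python
def quant_size_gib(params: float, bpw: float) -> float:
    """Approximate on-disk size in GiB for a model quantized at `bpw` bits per weight."""
    return params * bpw / 8 / 2**30

PARAMS = 235e9  # assumed total parameter count for Qwen3-235B

for bpw, listed in [(2.10, 59.287), (2.80, 78.295), (3.60, 100.116), (4.25, 117.803)]:
    est = quant_size_gib(PARAMS, bpw)
    print(f"{bpw:.2f} bpw: estimated {est:.1f} GiB, listed {listed} GiB")
```

The estimates land within a few percent of the listed files, which is a quick way to verify a quant will fit before downloading it.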
Model: MikeRoz/Qwen3-235B-A22B-Thinking-2507-exl3
Base model: Qwen/Qwen3-235B-A22B-Thinking-2507