⚠️ Compatibility Warning ⚠️
This quantized model has known issues with vLLM versions > 0.9.2 due to architecture compatibility problems.
Use this model instead: gghfez/command-a-03-2025-AWQ
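If you want to run the recommended replacement quant on a recent vLLM build, a minimal sketch is below. The tensor-parallel size and sampling settings are placeholder assumptions, not tested values for this model:

```python
# Minimal sketch: serving the recommended replacement AWQ quant with a recent vLLM.
# tensor_parallel_size and the sampling settings are assumptions; adjust for your setup.
from vllm import LLM, SamplingParams

llm = LLM(
    model="gghfez/command-a-03-2025-AWQ",  # the replacement repo noted above
    quantization="awq",                    # load the AWQ-quantized weights
    tensor_parallel_size=2,                # adjust to your GPU count
)

params = SamplingParams(temperature=0.3, max_tokens=256)
outputs = llm.generate(["Write a one-line summary of AWQ quantization."], params)
print(outputs[0].outputs[0].text)
```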
Issues:
- Created before proper Cohere2 support existed in AWQ tooling
- Uses the legacy "Cohere" architecture workaround (see the sketch after this list)
- Breaks with vLLM versions newer than 0.9.2
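To check why newer vLLM builds reject this quant, you can inspect the config it ships. The sketch below assumes `transformers` is installed and that the legacy workaround shows up as a "Cohere" (rather than "Cohere2") architecture string, as described above:

```python
# Minimal sketch: inspect which architecture string this quant's config.json advertises.
# The exact expected values in the comments are assumptions based on the workaround above.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("gghfez/c4ai-command-a-03-2025-AWQ")
print(cfg.model_type)     # expected 'cohere' for the legacy workaround, not 'cohere2'
print(cfg.architectures)  # e.g. ['CohereForCausalLM'] rather than ['Cohere2ForCausalLM']
```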
Alternatives:
- Command-A Reasoning AWQ (the reasoning version of this model): working 4-bit and 8-bit quants by cpatonn
- Command-A EXL3: ExLlamaV3 3.12bpw by Downtown-Case
Base model: CohereLabs/c4ai-command-a-03-2025