⚠️ Compatibility Warning ⚠️

This quantized model has known issues with vLLM versions > 0.9.2 due to architecture compatibility problems.

Use this model instead: gghfez/command-a-03-2025-AWQ

Issues:

  • Created before AWQ tooling had proper support for the Cohere2 architecture
  • Uses the legacy "Cohere" architecture as a workaround
  • Fails to load with newer vLLM versions
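The version constraint above can be checked programmatically before attempting to load this quant. A minimal sketch; the helper name and the use of the `packaging` library are assumptions for illustration, not part of this repo or of vLLM:

```python
from packaging.version import Version

# Last vLLM release known to load this quant (see warning above).
LAST_KNOWN_GOOD = Version("0.9.2")

def quant_is_compatible(vllm_version: str) -> bool:
    """Return True if this AWQ quant is expected to load under the given vLLM version."""
    return Version(vllm_version) <= LAST_KNOWN_GOOD

print(quant_is_compatible("0.9.2"))   # True
print(quant_is_compatible("0.10.1"))  # False
```

In practice you would pass `vllm.__version__` to the check, or simply pin `vllm==0.9.2` in your environment if you must use this quant instead of the recommended replacement.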

Alternatives:

  • Command-A Reasoning AWQ (the reasoning version of this model); see these working quants by cpatonn: 4-bit | 8-bit
  • Command-A EXL3: ExLlamaV3 3.12bpw by Downtown-Case