Run this on your Mac with Outlier — a free macOS app for local MLX inference.

Yi-Coder-9B-Chat (MLX 4-bit)

MLX 4-bit conversion of 01-ai/Yi-Coder-9B-Chat. License and base-model fields inherit from the original — see YAML frontmatter above.

Load with mlx-lm

pip install mlx-lm
python -m mlx_lm.generate --model Outlier-Ai/Yi-Coder-9B-Chat-MLX-4bit --prompt "Hello" --max-tokens 256
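You can also load the model from Python. A minimal sketch using mlx-lm's Python API (assumes Apple silicon with mlx-lm installed; note the first call downloads the weights from the Hub):

```python
from mlx_lm import load, generate

# Download (on first use) and load the 4-bit MLX weights from this repo.
model, tokenizer = load("Outlier-Ai/Yi-Coder-9B-Chat-MLX-4bit")

# Yi-Coder-9B-Chat is a chat model, so wrap the prompt in its chat template.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

text = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(text)
```

The chat-template step matters: sending a bare string to a chat-tuned model usually degrades output quality, since the model expects its role markers.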

What is Outlier?

A free macOS app that runs MLX models locally — no cloud, no API keys, no usage caps.

outlier.host

License

Inherits from upstream (apache-2.0). See base model card.
