Run this on your Mac with Outlier — a one-click app that loads MLX models locally. macOS arm64, free download.
DeepSeek-R1-Distill-Llama-8B (MLX 4-bit)
MLX 4-bit conversion of deepseek-ai/DeepSeek-R1-Distill-Llama-8B, repackaged for Apple Silicon. Original weights, original license — see frontmatter above. This repo only changes the on-disk format (safetensors, MLX 4-bit, chat_template.jinja, tokenizer).
About this conversion
- Format: MLX 4-bit safetensors (group size 64, symmetric)
- Tooling: mlx-lm 0.31.x compatible
- Files: `model.safetensors` shards · `config.json` · tokenizer · `chat_template.jinja`
- License: inherits from the upstream base model (see the YAML `license` field)
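For intuition, here is a minimal pure-Python sketch of the group-wise symmetric 4-bit scheme described above: weights are split into groups of 64, each group stores one scale, and values map to signed 4-bit integers in [-8, 7]. This is an illustrative approximation, not the actual mlx-lm quantizer.

```python
# Toy sketch of group-wise symmetric 4-bit quantization (illustrative only;
# mlx-lm's real implementation differs in details such as packing and scales).

GROUP_SIZE = 64

def quantize_group(group):
    """Symmetrically quantize one group of floats to signed 4-bit ints."""
    scale = max(abs(x) for x in group) / 7.0 or 1.0  # avoid divide-by-zero
    q = [max(-8, min(7, round(x / scale))) for x in group]
    return q, scale

def dequantize_group(q, scale):
    return [v * scale for v in q]

# Example: a toy "weight" group and its round-trip error
weights = [0.01 * i - 0.3 for i in range(GROUP_SIZE)]
q, scale = quantize_group(weights)
restored = dequantize_group(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(f"scale={scale:.4f}  max abs error={max_err:.4f}")

# Storage cost: 4 bits per weight plus one fp16 scale per 64 weights
bits_per_weight = 4 + 16 / GROUP_SIZE  # 4.25 bits/weight
print(f"~{bits_per_weight} bits/weight for an 8B-parameter model")
```

With one fp16 scale per 64 weights this works out to roughly 4.25 bits per weight, which is why the 4-bit conversion of an 8B model fits in a few gigabytes of disk and unified memory.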
Load directly with mlx-lm
```shell
pip install mlx-lm
python -m mlx_lm.generate \
  --model Outlier-Ai/DeepSeek-R1-Distill-Llama-8B-MLX-4bit \
  --prompt "Hello, world." \
  --max-tokens 256
```
Or in Python:
```python
from mlx_lm import load, generate

model, tokenizer = load("Outlier-Ai/DeepSeek-R1-Distill-Llama-8B-MLX-4bit")
print(generate(model, tokenizer, prompt="Hello, world.", max_tokens=256))
```
What is Outlier?
Outlier is a free macOS app that runs language models on your Mac, fully offline. Pick a model from a tier picker, click download, and chat — no API keys, no cloud round-trips, no usage caps. It ships with its own curated tier of MLX-4bit models and can also load any compatible MLX conversion (including this one) via the model picker.
➡ Download Outlier (free, Apple Silicon): outlier.host
For benchmark numbers (MMLU, HumanEval, tok/s on M-series Macs) with full provenance, see outlier.host/benchmarks.
Other Outlier conversions
- DeepSeek-R1-Distill-Qwen-7B (MLX 4-bit) — MLX 4-bit conversion (1,932 downloads)
- DeepSeek-R1-Distill-Qwen-32B (MLX 4-bit) — MLX 4-bit conversion (1,353 downloads)
- DeepSeek-R1-Distill-Qwen-14B (MLX 4-bit) — MLX 4-bit conversion (1,087 downloads)
- Outlier-Core-27B (MLX 4-bit) — MLX 4-bit conversion (55 downloads)
- Outlier-Nano-4B (MLX 4-bit) — MLX 4-bit conversion (67 downloads)
License
This conversion preserves the upstream license declared in the frontmatter (`mit`). Refer to the upstream base model card for the canonical license text and any usage restrictions.
Model tree for Outlier-Ai/DeepSeek-R1-Distill-Llama-8B-MLX-4bit
- Base model: deepseek-ai/DeepSeek-R1-Distill-Llama-8B