The model mlx-community/Yi-1.5-34B-Chat-4bit was converted to MLX format from 01-ai/Yi-1.5-34B-Chat using mlx-lm version 0.13.0.

Model added by Prince Canuma.

How to use mlx-community/Yi-1.5-34B-Chat-4bit with MLX:

# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir Yi-1.5-34B-Chat-4bit mlx-community/Yi-1.5-34B-Chat-4bit
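The same download can also be scripted from Python with huggingface_hub. A minimal sketch, assuming the huggingface_hub package installed above:

from huggingface_hub import snapshot_download

# Fetch the repo into a local folder, mirroring the CLI command above.
snapshot_download(
    repo_id="mlx-community/Yi-1.5-34B-Chat-4bit",
    local_dir="Yi-1.5-34B-Chat-4bit",
)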
pip install mlx-lm
from mlx_lm import load, generate
model, tokenizer = load("mlx-community/Yi-1.5-34B-Chat-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
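Since Yi-1.5-34B-Chat is a chat-tuned model, responses are usually better when the prompt is wrapped in the model's chat template before generation. A minimal sketch, assuming the tokenizer returned by load exposes the standard Hugging Face apply_chat_template method:

from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Yi-1.5-34B-Chat-4bit")

# Format the user message with the model's chat template before generating.
messages = [{"role": "user", "content": "hello"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, verbose=True)

If the model was already fetched with the huggingface-cli command above, load() also accepts that local directory path (for example load("Yi-1.5-34B-Chat-4bit")) in place of the Hub repo id.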