How to use from Unsloth Studio
Install Unsloth Studio (macOS, Linux, WSL)
curl -fsSL https://unsloth.ai/install.sh | sh
# Run unsloth studio
unsloth studio -H 0.0.0.0 -p 8888
# Then open http://localhost:8888 in your browser
# Search for Pinkstack/PGAM-WIT-Conversational-3B-PyTorch to start chatting
Install Unsloth Studio (Windows)
irm https://unsloth.ai/install.ps1 | iex
# Run unsloth studio
unsloth studio -H 0.0.0.0 -p 8888
# Then open http://localhost:8888 in your browser
# Search for Pinkstack/PGAM-WIT-Conversational-3B-PyTorch to start chatting
Using HuggingFace Spaces for Unsloth
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for Pinkstack/PGAM-WIT-Conversational-3B-PyTorch to start chatting
Load model with FastModel
pip install unsloth
from unsloth import FastModel

# Download the model and tokenizer from the Hugging Face Hub
model, tokenizer = FastModel.from_pretrained(
    model_name="Pinkstack/PGAM-WIT-Conversational-3B-PyTorch",
    max_seq_length=2048,
)
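To actually chat with the model after loading it, a minimal generation sketch follows. It assumes you are running on a GPU and that the tokenizer carries a chat template from the model's TRL training; the prompt, the FastModel.for_inference call, and the generation settings are illustrative, not part of the official card.

from unsloth import FastModel
from transformers import TextStreamer

model, tokenizer = FastModel.from_pretrained(
    model_name="Pinkstack/PGAM-WIT-Conversational-3B-PyTorch",
    max_seq_length=2048,
)
FastModel.for_inference(model)  # enable Unsloth's faster inference path

# Build a single-turn chat prompt; assumes the tokenizer ships a chat template.
messages = [{"role": "user", "content": "Tell me something interesting."}]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Stream the reply to stdout as it is generated.
_ = model.generate(
    input_ids,
    max_new_tokens=256,
    streamer=TextStreamer(tokenizer, skip_prompt=True),
)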
This is a base/testing model. It is recommended for further fine-tuning or training.

This model is odd. It was trained on both Grok and Hugging Face ultrachat_200k datasets, so it acts oddly, but it is interesting to mess around with. WIT stands for weird & interesting transformer.

Uploaded model

  • Developed by: Pinkstack
  • License: apache-2.0
  • Finetuned from model: Pinkstack/PGAM-WIT-Conversational-3B-vLLM (original version)

This model was trained with Unsloth and Hugging Face's TRL library.
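Since the card recommends this model for further fine-tuning, the sketch below shows one way to continue training it with Unsloth and TRL. The dataset slice, LoRA settings, and hyperparameters are illustrative assumptions, not the configuration used to produce this model.

from unsloth import FastModel
from trl import SFTConfig, SFTTrainer
from datasets import load_dataset

model, tokenizer = FastModel.from_pretrained(
    model_name="Pinkstack/PGAM-WIT-Conversational-3B-PyTorch",
    max_seq_length=2048,
)

# Attach LoRA adapters so only a small set of extra weights is trained.
model = FastModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Small slice of ultrachat_200k as a stand-in dataset; recent TRL versions
# apply the tokenizer's chat template to its "messages" column automatically.
dataset = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft[:1%]")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    args=SFTConfig(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=60,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()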

Safetensors · Model size: 3B params · Tensor type: F16
