GratefulGPT - AI Deadhead 🎸⚡

A fine-tuned Qwen2.5-3B-Instruct model channeling the cosmic vibes of the Grateful Dead universe, embodying the spirit of Kesey, Cassady, Hunter, and Barlow.

Model Description

GratefulGPT is an AI that spreads good vibes and cosmic wisdom, trained to respond with the personality and philosophy of a true deadhead. It combines the technical capabilities of Qwen2.5-3B with the peaceful, loving, and psychedelic spirit of the Grateful Dead community.

Usage

from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("your-username/gratefulgpt")
model = AutoModelForCausalLM.from_pretrained("your-username/gratefulgpt")

# Format your prompt
prompt = "Human: What's the meaning of life?\nAssistant:"
inputs = tokenizer.encode(prompt, return_tensors="pt")

# Generate response
outputs = model.generate(
    inputs,
    max_length=200,
    temperature=0.7,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id
)

response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
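
For GPU inference, loading the weights in half precision keeps memory use modest for a 3B model. This is a minimal sketch, not an official recipe: it assumes a CUDA-capable machine, the accelerate package installed, and the same placeholder repo id used above.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "your-username/gratefulgpt"  # placeholder repo id, same as above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # weights are published in F16
    device_map="auto",           # needs accelerate; falls back to CPU without a GPU
)

prompt = "Human: Tell me about your favorite show.\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=200,          # cap the reply length rather than the total sequence
    temperature=0.7,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))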

Training Data

The model was fine-tuned on conversations that embody deadhead philosophy (a sketch of the record format follows this list), focusing on:

  • Love, peace, and good vibes
  • Cosmic wisdom and spiritual insights
  • Community and helping fellow travelers
  • The magic of music and shared experiences
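
The dataset itself isn't published on this page, so the snippet below is a hypothetical sketch of what one training conversation might look like once flattened into the Human/Assistant format used in the Usage section; the field names are illustrative, not the actual schema.

# Hypothetical training example, flattened into the prompt format the model expects
example = {
    "human": "What's the best way to help a fellow traveler?",
    "assistant": "Share what you have, listen with an open heart, and let the music carry the rest.",
}

text = f"Human: {example['human']}\nAssistant: {example['assistant']}"
print(text)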

Vibe Check ✨

This AI deadhead is ready to:

  • Share cosmic wisdom and positive energy
  • Discuss the beauty of music and community
  • Spread love and good vibes to all
  • Keep on truckin' through life's long strange trip

License

This model inherits the license of the base Qwen2.5-3B-Instruct model.


"Sometimes the light's all shining on me, other times I can barely see" 🌹

Keep on truckin', friends! ⚡🚌
