Model Card for gemma-3-svg-generator

This model is under development; results may not be reliable yet.

This model is a fine-tuned version of google/gemma-3-1b-it. It has been trained using TRL.

Quick start

from transformers import pipeline

question = "Generate code for an SVG (Scalable Vector Graphics) of a cat"

# Load the fine-tuned model as a chat-style text-generation pipeline
# (device="cuda" assumes a GPU is available; drop it or use device="cpu" otherwise).
generator = pipeline("text-generation", model="shorecode/gemma-3-svg-generator", device="cuda")

# Send the prompt in chat format and print only the newly generated reply.
output = generator([{"role": "user", "content": question}], max_new_tokens=512, return_full_text=False)[0]
print(output["generated_text"])
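If the reply contains a complete <svg> element, you can write it to a file and open it in a browser. The snippet below is a minimal sketch under that assumption; the model may also wrap the markup in Markdown code fences, in which case strip those first.

# Extract the <svg>...</svg> span from the reply and save it to disk.
# This assumes the model emitted a complete SVG element somewhere in its output.
svg_text = output["generated_text"]
start, end = svg_text.find("<svg"), svg_text.rfind("</svg>")
if start != -1 and end != -1:
    with open("cat.svg", "w", encoding="utf-8") as f:
        f.write(svg_text[start:end + len("</svg>")])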

Training procedure

This model was trained with supervised fine-tuning (SFT).
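For reference, the sketch below shows what an SFT run with TRL's SFTTrainer typically looks like. The dataset name, column format, and hyperparameters are illustrative assumptions, not the actual settings used for this model.

from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Hypothetical dataset with a "messages" column of chat-style pairs
# (user asks for an SVG, assistant replies with SVG code).
dataset = load_dataset("your-username/your-svg-dataset", split="train")

# Illustrative hyperparameters only.
training_args = SFTConfig(
    output_dir="gemma-3-svg-generator",
    per_device_train_batch_size=2,
    num_train_epochs=1,
    learning_rate=2e-5,
)

trainer = SFTTrainer(
    model="google/gemma-3-1b-it",  # base model named in this card
    args=training_args,
    train_dataset=dataset,
)
trainer.train()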

Framework versions

  • TRL: 0.25.0
  • Transformers: 4.57.1
  • Pytorch: 2.8.0+cu126
  • Datasets: 4.0.0
  • Tokenizers: 0.22.1

Citations

Cite TRL as:

@misc{vonwerra2022trl,
    title        = {{TRL: Transformer Reinforcement Learning}},
    author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
    year         = 2020,
    journal      = {GitHub repository},
    publisher    = {GitHub},
    howpublished = {\url{https://github.com/huggingface/trl}}
}