SVG Code Generator

This is a fine-tuned model for generating SVG code from natural language descriptions. The fine-tuned weights have been merged into the base model and saved in fp16 precision.

Model Details

  • Model Name: model_v15
  • Base Model: Qwen3-0.6B
  • Training Method: Fine-tuning with merged weights
  • Task: Text-to-SVG code generation
  • Model Type: Merged Qwen model
  • Precision: fp16
  • Library: Transformers, vLLM compatible
  • Format: Merged model (not adapter-based)

Usage

With Transformers

Load the model directly using the transformers library:

# Load the merged model and tokenizer
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("vinoku89/svg-code-generator")
model = AutoModelForCausalLM.from_pretrained("vinoku89/svg-code-generator")
# Generate SVG code
prompt = "Create a blue circle with radius 50"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate with sampling parameters
outputs = model.generate(
    **inputs,
    max_new_tokens=200,
    temperature=0.7,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id
)

# Decode the generated SVG code
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
svg_code = generated_text[len(prompt):].strip()

print("Generated SVG:")
print(svg_code)

With vLLM

This model supports vLLM for high-performance inference in fp16 format.
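A minimal sketch of serving the model with vLLM (assumes a recent vLLM release and a CUDA-capable GPU; the sampling values simply mirror the Transformers example above):

```python
from vllm import LLM, SamplingParams

# Load the merged fp16 checkpoint; no adapter files are needed.
llm = LLM(model="vinoku89/svg-code-generator", dtype="float16")
sampling = SamplingParams(temperature=0.7, max_tokens=200)

# vLLM batches prompts, so a list is passed even for a single request.
outputs = llm.generate(["Create a blue circle with radius 50"], sampling)
print(outputs[0].outputs[0].text)
```

Because the weights are fully merged, vLLM can load the checkpoint directly without any LoRA-specific configuration.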

Training Data

The model was trained on pairs of natural language descriptions and their corresponding SVG code.

Intended Use

This model is designed to generate SVG code from text descriptions for educational and creative purposes.

Limitations

  • Generated SVG may be malformed and should be validated before use
  • Performance depends on prompt clarity
  • Limited to SVG syntax and features seen during training
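Since generated SVG may be malformed, a quick well-formedness check with the Python standard library can filter out broken output. This is a sketch covering XML syntax and the root tag only; validating SVG semantics would need a fuller toolchain:

```python
import xml.etree.ElementTree as ET

def is_well_formed_svg(svg_code: str) -> bool:
    """Return True if the string parses as XML and the root element is <svg>."""
    try:
        root = ET.fromstring(svg_code)
    except ET.ParseError:
        return False
    # SVG elements are usually namespaced; strip the namespace before comparing.
    return root.tag.split("}")[-1] == "svg"

print(is_well_formed_svg(
    '<svg xmlns="http://www.w3.org/2000/svg"><circle r="50" fill="blue"/></svg>'
))  # True
print(is_well_formed_svg('<svg><circle r="50"'))  # False: truncated markup
```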

Model Performance

The model has been fine-tuned specifically for SVG generation tasks. The fine-tuned weights are merged into the base model, so no adapter loading is required at inference time.

Technical Details

  • Precision: fp16 for memory efficiency
  • Compatibility: vLLM supported for high-throughput inference
  • Architecture: Merged fine-tuned weights (no adapters required)
  • Size: 596M parameters, distributed in Safetensors format