# SAM1-600M
## Chat Template
```
User: {{input}}
Sam: {{output}}
```
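To build prompts in this format programmatically, a small helper along these lines works; `build_prompt` is an illustrative name and is not part of the released code:

```python
# Hypothetical helper (not shipped with the model) that renders a
# conversation history into the "User: ... / Sam: ..." template above.
def build_prompt(turns):
    """turns is a list of (user_text, sam_text) pairs; pass None as the
    last sam_text to leave the prompt open for generation."""
    lines = []
    for user_text, sam_text in turns:
        lines.append(f"User: {user_text}")
        lines.append(f"Sam: {sam_text}" if sam_text is not None else "Sam:")
    return "\n".join(lines)

prompt = build_prompt([("Hello!", None)])
# -> "User: Hello!\nSam:"
```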
## Model Stats
- Parameters: 348,357,632 (~348.4M)
- Architecture: 24 layers × 1024 hidden dim × 16 attention heads
- Final Perplexity: 4.81
- Final Accuracy: 80.34%
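To confirm the parameter count locally, summing the tensor sizes of a loaded checkpoint should reproduce the figure above (the repo id is the same placeholder used in the Usage section below):

```python
from transformers import AutoModelForCausalLM

# Replace YOUR_USERNAME with the account hosting the checkpoint.
model = AutoModelForCausalLM.from_pretrained("YOUR_USERNAME/sam1-600m")

# Total parameter count; this should print roughly 348,357,632.
total = sum(p.numel() for p in model.parameters())
print(f"{total:,} parameters")
```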
## Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Replace YOUR_USERNAME with the account hosting the checkpoint.
tokenizer = AutoTokenizer.from_pretrained("YOUR_USERNAME/sam1-600m")
model = AutoModelForCausalLM.from_pretrained("YOUR_USERNAME/sam1-600m")

# The prompt follows the chat template above.
prompt = "User: Hello!\nSam:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 100 new tokens after the prompt (greedy decoding).
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
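For chat-style replies, sampling often reads better than greedy decoding. The sketch below assumes the tokenizer defines an EOS token, and the temperature/top-p values are illustrative defaults, not settings tuned for this model:

```python
# Sample a reply instead of decoding greedily.
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # assumes an EOS token is set
)

# Slice off the prompt tokens so only Sam's reply is printed.
reply = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(reply)
```

Dropping the prompt tokens before decoding keeps the output limited to the model's continuation after `Sam:`.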