This is only a very quick SFT (a few minutes in Google Colab) on a very tiny subset (60 steps) of cstr/capybara_de_sharegpt with Unsloth, just as a proof of concept that this works: the model then outputs (not very good) German. Prompt it with the ChatML template.
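
As a rough usage sketch, here is one way to prompt the GGUF file with the ChatML template via llama-cpp-python. The model path, system message, and generation settings below are placeholders, not something shipped with this repo.

```python
from llama_cpp import Llama

# Placeholder path: point this at the GGUF file you downloaded from this repo.
llm = Llama(model_path="model.gguf", n_ctx=2048)

# ChatML-formatted prompt, matching the template used for the fine-tune.
prompt = (
    "<|im_start|>system\n"
    "Du bist ein hilfreicher Assistent.<|im_end|>\n"
    "<|im_start|>user\n"
    "Erkläre kurz, was ein Sprachmodell ist.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

# Stop on the ChatML end-of-turn token so generation ends after the answer.
out = llm(prompt, max_tokens=256, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```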

GGUF · 4B params · llama architecture