These are all of the intermediate tensors created by llama.cpp during a single forward pass through Qwen3-0.6B.

ref: https://github.com/ddh0/llama.cpp/tree/save-tensor-npy-feature

```
./build/bin/llama-eval-callback -m ~/gguf/Qwen3-0.6B-f32.gguf -p "The quick brown fox" -n 1 -v
```
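
A minimal sketch for inspecting the saved tensors, assuming the patched `eval-callback` branch writes each intermediate tensor to its own `.npy` file in the working directory (the glob pattern and file layout here are assumptions, not guaranteed by the tool):

```python
# Sketch: list every saved tensor and print its shape/dtype, assuming
# each intermediate tensor was dumped as a standalone .npy file.
from pathlib import Path

import numpy as np

for path in sorted(Path(".").glob("*.npy")):  # assumed output location/pattern
    arr = np.load(path)
    # e.g. a hidden-state tensor would typically be (n_tokens, n_embd)
    print(f"{path.name}: shape={arr.shape}, dtype={arr.dtype}")
    print("  first values:", arr.flatten()[:8])
```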
