Can you provide your base models in BF16 format, instead of FP16?

#29 opened by Vahid-Rakhshan

Hi. Can you provide your base models in BF16 format instead of FP16? BF16 (Brain Float 16) preserves FP32's full exponent range, so it seems to handle the dynamic range of model weights considerably better than FP16 [1,2]. A base model released in BF16 would also ensure that the resulting GGUF conversion (which would probably be in BF16 as well) involves no data loss [1].
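
To illustrate why the format matters: both types use 16 bits, but BF16 keeps FP32's 8-bit exponent with only 7 mantissa bits, while FP16 has a 5-bit exponent and 10 mantissa bits. Casting BF16 weights to FP16 is exact in the mantissa but can overflow or underflow in the exponent, which is the kind of loss discussed in [1]. A minimal sketch, assuming PyTorch is installed:

```python
import torch

# BF16 reuses FP32's 8-bit exponent (finite values up to ~3.4e38), while FP16
# has a 5-bit exponent whose largest finite value is 65504.
bf16_weights = torch.tensor([1.0, 5.0e4, 1.0e5, 3.0e38], dtype=torch.bfloat16)

# Casting to FP16: 1.0 and 5e4 fit, but 1e5 and 3e38 overflow to inf (data loss).
print(bf16_weights.to(torch.float16))

# Casting to FP32 (or keeping BF16 in the GGUF) is exact for every BF16 value.
print(bf16_weights.to(torch.float32))
```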

References:
[1] https://www.reddit.com/r/LocalLLaMA/comments/1fcjtpo/reflection_and_the_neverending_confusion_between/
[2] https://www.reddit.com/r/LocalLLaMA/comments/1axkwpf/gguf_16bit/
