4bit / stable-vicuna-13B-GPTQ

Text Generation · Transformers · PyTorch · English · llama · causal-lm · text-generation-inference

stable-vicuna-13B-GPTQ / stable-vicuna-13B-GPTQ-4bit.compat.no-act-order.safetensors
Commit 07c9a9a: thanks to TheBloke ❤
This file is stored with Xet. It is too big to display, but you can still download it.

Large File Pointer Details (Raw pointer file)
SHA256: 442d71b56bc16721d28aeb2d5e0ba07cf04bfb61cc7af47993d5f0a15133b520
Pointer size: 135 Bytes
Size of remote file: 7.26 GB
Xet hash: 2c66551abd0b93ee97b3873830344246541694f7bbeb2a39bebf0df5302922a9

Xet efficiently stores large files inside Git, splitting them into unique chunks to accelerate uploads and downloads.
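As a minimal sketch of fetching and checking this file: the repository id below is assumed from the breadcrumb at the top of the page (4bit / stable-vicuna-13B-GPTQ), and the pinned revision is the commit shown above; adjust either if the model lives under a different namespace or you want the latest revision.

```python
# Minimal sketch: download the safetensors file and verify it against the
# SHA256 listed in the pointer details above. repo_id and revision are
# assumptions taken from this page, not confirmed by the repository itself.
import hashlib

from huggingface_hub import hf_hub_download

EXPECTED_SHA256 = "442d71b56bc16721d28aeb2d5e0ba07cf04bfb61cc7af47993d5f0a15133b520"

# Resolves the Xet/LFS pointer and downloads the full 7.26 GB file into the local cache.
path = hf_hub_download(
    repo_id="4bit/stable-vicuna-13B-GPTQ",  # assumed from the page breadcrumb
    filename="stable-vicuna-13B-GPTQ-4bit.compat.no-act-order.safetensors",
    revision="07c9a9a",  # commit shown on this page; omit to use the latest revision
)

# Stream the file through SHA-256 and compare with the pointer metadata.
sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha.update(chunk)

assert sha.hexdigest() == EXPECTED_SHA256, "hash mismatch, re-download the file"
print("verified:", path)
```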