Instructions to use trl-lib/llama-7b-se-rm-peft with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers

How to use trl-lib/llama-7b-se-rm-peft with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("trl-lib/llama-7b-se-rm-peft", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
- Xet hash: 90863eee7183f927b51ebac9ab99f296395f14cfc99209b1c5c223e2ce60d6af
- Size of remote file: 16.8 MB
- SHA256: 4fc6df49f3d18888b83fa648801f28e6c54d39cea5b625d272c807ff852d5a59
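The SHA256 above lets you verify a downloaded file out of band. A minimal sketch in Python's standard library (the local filename in the comment is a placeholder you would substitute with your actual download path):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so large checkpoints never sit fully in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result to the SHA256 listed above, e.g.:
# assert sha256_of("path/to/downloaded-file") == expected_sha256
```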
Xet efficiently stores large files inside Git by splitting them into unique chunks, which deduplicates storage and accelerates uploads and downloads.