Load the AceGPT-7B base model and attach the adapter:

```python
from peft import PeftModel
from transformers import AutoTokenizer
from unsloth import FastLanguageModel

# FastLanguageModel.from_pretrained returns (model, tokenizer);
# keep the model here and load the tokenizer explicitly below.
base_model, _ = FastLanguageModel.from_pretrained("FreedomIntelligence/AceGPT-7B")

# Attach the fine-tuned LoRA adapter on top of the base model.
model = PeftModel.from_pretrained(base_model, "hamywaleed/tabib_beetlware_v1")

tokenizer = AutoTokenizer.from_pretrained("FreedomIntelligence/AceGPT-7B")
```
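
Once the adapter is attached, the model can be queried like any causal LM. Below is a minimal generation sketch, assuming the model fits in available memory; the Arabic prompt is purely illustrative and not from the model card:

```python
import torch

# Illustrative prompt: "What are the symptoms of diabetes?"
prompt = "ما هي أعراض مرض السكري؟"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding; tune max_new_tokens and sampling options as needed.
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```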

Model size: 3.64B params (Safetensors; tensor types: F32, F16, U8)
