Inference with transformers failed

I'm trying to run inference on this model with the transformers library.

The tokenizer instantiation fails with:

\lib\site-packages\transformers\models\llama\tokenization_llama.py", line 201, in get_spm_processor
    with open(self.vocab_file, "rb") as f:
TypeError: expected str, bytes or os.PathLike object, not NoneType

The code I use:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    repo_id_origin,
    cache_dir="./model",
    trust_remote_code=True,
    torch_dtype="auto",
    low_cpu_mem_usage=True,
    device_map="auto",
)
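I suspect the ONNX repo simply doesn't ship the sentencepiece tokenizer.model file, which would explain vocab_file ending up as None. As a workaround I can at least get a tokenizer by pointing at the base repo instead (just a sketch on my side, assuming mistralai/Mistral-7B-Instruct-v0.3 uses the same vocabulary as this quantization):

from transformers import AutoTokenizer

# Workaround sketch: load the tokenizer from the original base repo,
# which should contain the tokenizer files the ONNX repo lacks.
tokenizer = AutoTokenizer.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.3",
    cache_dir="./model",
)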

The model instantiation fails with:

\lib\site-packages\transformers\modeling_utils.py", line 1245, in _get_resolved_checkpoint_files
    raise EnvironmentError(
OSError: nvidia/Mistral-7B-Instruct-v0.3-ONNX-INT4 does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
The call that triggers it:

from transformers import MistralForCausalLM

model = MistralForCausalLM.from_pretrained(
    repo_id,
    cache_dir="./model",
    low_cpu_mem_usage=True,
    torch_dtype="auto",
    device_map="auto",
)
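Since the repo only ships ONNX INT4 weights (hence the missing pytorch_model.bin / model.safetensors), I assume transformers can't load it directly and an ONNX runtime is needed instead. This is the kind of thing I would try with onnxruntime-genai (untested sketch; it assumes a recent onnxruntime-genai version that has Generator.append_tokens, and that the downloaded folder contains a genai_config.json next to the .onnx files, which I haven't verified for this repo):

import onnxruntime_genai as og

# Untested sketch: point og.Model at the local directory holding the
# ONNX files (the path below is hypothetical).
model = og.Model("./model/Mistral-7B-Instruct-v0.3-ONNX-INT4")
tokenizer = og.Tokenizer(model)

# Basic greedy generation loop with a capped output length.
params = og.GeneratorParams(model)
params.set_search_options(max_length=256)

generator = og.Generator(model, params)
generator.append_tokens(tokenizer.encode("[INST] Hello! [/INST]"))
while not generator.is_done():
    generator.generate_next_token()

print(tokenizer.decode(generator.get_sequence(0)))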

Are there proper options for running inference on this model?

thanks
