model_type not in config.json

#10 by plain174

transformers 4.52.0

from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "Qwen/Qwen2.5-14B"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

Bug:

-> 1149     raise ValueError(
   1150         f"Unrecognized model in {pretrained_model_name_or_path}. "
   1151         f"Should have a model_type key in its {CONFIG_NAME}, or contain one of the following strings "
   1152         f"in its name: {', '.join(CONFIG_MAPPING.keys())}"
   1153     )
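For reference, here is a minimal diagnostic sketch (assuming network access and the huggingface_hub package) that inspects the config.json the hub actually serves, to confirm whether the model_type key is present:

import json
from huggingface_hub import hf_hub_download

# Download (or reuse from the local cache) the repo's config.json.
config_path = hf_hub_download(repo_id="Qwen/Qwen2.5-14B", filename="config.json")
with open(config_path) as f:
    config = json.load(f)

# AutoConfig resolution needs this key; for Qwen2.5 it should be "qwen2".
print(config.get("model_type"))

If this prints None, the cached file is likely stale or truncated, and clearing that cache entry before re-downloading would be the first thing to try.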

I can't load this model.
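In the meantime, one possible workaround sketch (assuming the checkpoint uses the standard Qwen2 architecture, which Qwen2.5-14B does) is to load through the concrete classes instead of the Auto classes, which should skip the CONFIG_MAPPING lookup that raises the error above:

from transformers import Qwen2ForCausalLM, Qwen2TokenizerFast

model_name = "Qwen/Qwen2.5-14B"
# The concrete classes resolve the config as Qwen2Config directly,
# so the Auto registry's model_type lookup is never consulted.
tokenizer = Qwen2TokenizerFast.from_pretrained(model_name)
model = Qwen2ForCausalLM.from_pretrained(model_name)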
