I've deployed the model via Hugging Face Inference Endpoints. Whenever I send a multimodal query, whether via curl, via code, or via the chat interface in the Inference Endpoints UI, I get the following error:
ValueError: /repository is not a multimodal model.
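For reference, this is a minimal sketch of the kind of payload I'm sending (the endpoint URL and image URL are placeholders, and I'm assuming the OpenAI-style chat schema that Inference Endpoints exposes at `/v1/chat/completions`):

```python
import json

# Placeholder -- substitute your own Inference Endpoint URL and token.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud/v1/chat/completions"

# OpenAI-style multimodal chat payload: one text part plus one image_url part.
payload = {
    "model": "tgi",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/cat.png"},
                },
            ],
        }
    ],
    "max_tokens": 128,
}

print(json.dumps(payload, indent=2))
```

POSTing this payload (with an `Authorization: Bearer <token>` header) is what produces the error above, and the same error appears for the curl and UI variants.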