Does SmolVLM2-2.2B-Instruct Support Function Calling?

#26
by dgallitelli

Hello! I am exploring the use of HuggingFaceTB/SmolVLM2-2.2B-Instruct for advanced LLM applications on Amazon SageMaker, specifically with both the DJL LMI container (using vLLM as the backend) and the TGI container image. My goal is to leverage structured function calling, similar to what other models like SmolLM2-1.7B-Instruct support. However, I have not found explicit documentation or examples confirming that SmolVLM2-2.2B-Instruct has been fine-tuned for function calling or supports it out of the box.

Could the maintainers or the community clarify whether this model supports function calling and, if so, provide guidance on prompt formatting and integration with these SageMaker deployment options? Any tips or sample code for enabling structured tool use would be greatly appreciated! For context, the sketch below shows the kind of request I would like to send.
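This is a minimal sketch of the client-side call I have in mind, assuming the deployed container exposes an OpenAI-compatible `/v1` route (both vLLM and TGI can do this). The endpoint URL and the `get_weather` tool are placeholders for illustration; whether SmolVLM2-2.2B-Instruct's chat template actually consumes the `tools` field is exactly what I am asking about.

```python
from openai import OpenAI

# Placeholder endpoint; in practice this would be the SageMaker-hosted
# OpenAI-compatible route exposed by the vLLM (DJL LMI) or TGI container.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

# Hypothetical tool definition, only for illustration.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

# Structured tool-use request; it is unclear to me whether the model's
# chat template will render these tools or simply ignore them.
response = client.chat.completions.create(
    model="HuggingFaceTB/SmolVLM2-2.2B-Instruct",
    messages=[{"role": "user", "content": "What's the weather in Milan?"}],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message)
```

If the model has not been trained for tool use, I assume I would need either a custom chat template with few-shot prompting or constrained/guided decoding on the serving side, but confirmation either way would help.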
