Small changes in readme for vLLM serve
#154
by
Navanit-AI
- opened
Running `vllm serve` has become much simpler over time, so kindly update this README accordingly.
Neither the instructions in the official README nor those in this PR work universally.
The best bet is to pin the version with `uv pip install vllm==0.10.2 --torch-backend=auto`, since future updates may break compatibility with various dependencies.
In general, the Hugging Face instructions are fine for newly released models. Later on it makes more sense to just follow the official vLLM docs: https://docs.vllm.ai/projects/recipes/en/latest/OpenAI/GPT-OSS.html#gpt-oss-vllm-usage-guide
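For reference, a minimal sketch of the pinned install and serve steps discussed above; the model name `openai/gpt-oss-20b` and the port are illustrative assumptions, not values from this thread:

```shell
# Pin the vLLM version so future releases can't break dependency compatibility;
# --torch-backend=auto lets uv pick the matching PyTorch build for the local GPU stack.
uv pip install vllm==0.10.2 --torch-backend=auto

# Serve a model behind the OpenAI-compatible API.
# Model name and port are placeholders; substitute your own.
vllm serve openai/gpt-oss-20b --port 8000

# Once up, the server answers standard OpenAI-style requests, e.g.:
curl http://localhost:8000/v1/models
```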