Small changes in README for vLLM serve

#154

`vllm serve` has finally become simple and easy to use, so kindly update this README accordingly.

@dkundel-openai Kindly look into these minor maintenance changes.

Neither the instructions in the official README nor those in this PR work universally.

The safest bet is to use `uv pip install vllm==0.10.2 --torch-backend=auto`, since future updates may break compatibility with various dependencies.
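For reference, a minimal sketch of that pinned install in a fresh environment (the virtual-env step is an assumed workflow; the version pin and `--torch-backend=auto` flag come from the command above):

```bash
# Create an isolated environment first (assumed workflow, not required by vLLM itself).
uv venv

# Pin vLLM so future releases can't break dependency compatibility;
# --torch-backend=auto lets uv pick the PyTorch build matching the local CUDA setup.
uv pip install vllm==0.10.2 --torch-backend=auto
```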

In general, the Hugging Face instructions are fine for newly released models. After that, it makes more sense to just follow the official vLLM docs: https://docs.vllm.ai/projects/recipes/en/latest/OpenAI/GPT-OSS.html#gpt-oss-vllm-usage-guide
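Once installed, serving follows the usual OpenAI-compatible pattern from those docs. A hedged sketch, where the model id `openai/gpt-oss-20b` and the port are example values, not something this PR prescribes:

```bash
# Start an OpenAI-compatible API server (model id and port are example values).
vllm serve openai/gpt-oss-20b --port 8000

# Query the standard chat-completions endpoint once the server is up.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "openai/gpt-oss-20b", "messages": [{"role": "user", "content": "Hello"}]}'
```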
