[Bug]: vllm: error: unrecognized arguments: --config #8273
Comments
Hey - is there an example somewhere that suggests we support loading from a config file? I don't believe we currently support this. I do think launching with a […]
I think this option should already be supported: […] You can also see this option in the script code here: […]
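(Not vLLM's actual code, but a generic sketch of how a `--config` flag is commonly wired with argparse: parse once to discover the file, load the YAML, and feed its values in as defaults so explicit CLI flags still win. The flag names and structure below are illustrative assumptions.)

```python
import argparse
import yaml  # PyYAML

# Generic two-pass pattern (illustrative, not vLLM's implementation):
parser = argparse.ArgumentParser()
parser.add_argument("--config")
parser.add_argument("--host", default="0.0.0.0")
parser.add_argument("--port", type=int, default=8000)

# First pass: only to discover --config.
args, _ = parser.parse_known_args()
if args.config:
    with open(args.config) as f:
        overrides = yaml.safe_load(f) or {}
    # YAML keys mirror the flag names; dashes map to underscores.
    parser.set_defaults(**{k.replace("-", "_"): v for k, v in overrides.items()})

# Second pass: config values act as defaults, explicit flags override them.
args = parser.parse_args()
```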
Oh cool! Looks like this just got added.
#7737 is not released yet. If you want to use it, please use the per-commit wheel; see https://docs.vllm.ai/en/latest/getting_started/installation.html
I think this is in v0.6.0.
Aah, this explains a lot! Alright, thank you for the fast help! 👍
Original issue:

Hi guys,

I cannot serve a vLLM server with a config.yaml file using

```
vllm serve model --config config.yaml
```

Regardless of the specified path, I always get:

```
vllm: error: unrecognized arguments: --config
```

I tried quoting the path, I tried relative and absolute paths, and I also tried a wrong, non-existing path, all with the error above. The server runs just fine if I do not use the `--config` argument. The content of the .yaml file is just as in the documentation. Any idea what's wrong here?
Update:

I tried to manually specify various commands and got confusing results. These work: […]

However, if I start the server via a Python subprocess with the same commands, `host` and `port` are suddenly unknown. Again, the server works just fine if no arguments are given.
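(The exact commands were not preserved; the sketch below is a hypothetical reconstruction of the failing pattern, with a made-up model name and values.)

```python
import subprocess

# Flag and value as separate list items — the form that was
# rejected with "unrecognized arguments" in the reporter's setup.
subprocess.run([
    "vllm", "serve", "some-model",
    "--host", "127.0.0.1",
    "--port", "8000",
])
```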
Update 2:

The Python commands work if we use `=` to assign the arguments:
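(Again a hypothetical reconstruction, this time of the form that worked:)

```python
import subprocess

# Same invocation, but each value attached to its flag with '='.
subprocess.run([
    "vllm", "serve", "some-model",
    "--host=127.0.0.1",
    "--port=8000",
])
```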