
[Installation]: vllm-flash-attn's pytorch is not compatible with vllm 0.6.5's #11283

Closed
1 task done
jfcherng opened this issue Dec 18, 2024 · 3 comments
Labels
installation Installation problems

Comments

@jfcherng

Your current environment

As the title says: vllm-flash-attn pins torch==2.4.0, but vllm 0.6.5 requires torch==2.5.1.

How you are installing vllm

$ uv pip install vllm==0.6.5 vllm-flash-attn
  × No solution found when resolving dependencies:
  ╰─▶ Because vllm==0.6.5 depends on torch{platform_machine != 'aarch64'}==2.5.1 and
      vllm-flash-attn>=2.6.1 depends on torch==2.4.0, we can conclude that vllm==0.6.5 and
      all of:
          vllm-flash-attn==2.6.1
          vllm-flash-attn==2.6.2
       are incompatible. (1)

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
@jfcherng added the installation (Installation problems) label on Dec 18, 2024
@noooop
Contributor

noooop commented Dec 18, 2024

LOL

On a serious note:

from #8245

vllm-flash-attn is compiled together with vllm instead of separately.
So you do not need to install vllm-flash-attn separately.

Apparently, the cmake files in the vllm-flash-attn repository have not been kept in sync with vllm,
so there is currently no way to install vllm-flash-attn separately.
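In other words, the conflict above only appears when vllm-flash-attn is requested explicitly; installing vllm alone should resolve cleanly. A minimal sketch of the working install (the vllm.vllm_flash_attn module path is an assumption about where recent vllm wheels expose the bundled kernels; verify against your installed version):

$ uv pip install vllm==0.6.5
  # pulls in torch==2.5.1; the flash-attn kernels ship inside the vllm wheel
$ python -c "import vllm; print(vllm.__version__)"
  # sanity check that the package imports against the resolved torch
$ python -c "import vllm.vllm_flash_attn"
  # assumed import path for the bundled kernels; adjust if your vllm version lays it out differently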

@jfcherng
Author

vllm-flash-attn is compiled together with vllm instead of separately.

Even better. Thank you @noooop :)

@jfcherng
Author

jfcherng commented Dec 18, 2024

I guess my issue has been resolved/clarified.
