[Installation]: vllm-flash-attn's pytorch is not compatible with vllm 0.6.5's #11283
Closed
Labels: installation (Installation problems)
Your current environment
As the title says: vllm-flash-attn pins torch==2.4.0, but vllm 0.6.5 requires torch==2.5.1.
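A quick way to confirm the mismatch in a given environment (a minimal sketch, assuming both packages are already installed; the exact pins may differ between builds) is to query the declared requirements with importlib.metadata:

```python
# Sketch: compare the installed torch version against what each package declares.
# Assumes vllm and vllm-flash-attn are installed in the current environment.
from importlib.metadata import version, requires

print("torch installed:", version("torch"))

for pkg in ("vllm", "vllm-flash-attn"):
    # requires() returns the package's Requires-Dist entries (or None).
    torch_reqs = [r for r in (requires(pkg) or []) if r.startswith("torch")]
    print(f"{pkg} {version(pkg)} requires:", torch_reqs or "no torch pin found")
```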
How you are installing vllm
Before submitting a new issue...