update flash attention source
requirements.txt +1 -1

requirements.txt CHANGED
@@ -4,4 +4,4 @@ transformers
 accelerate
 qwen-vl-utils
 gradio
-flash-attn
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
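The pinned wheel name encodes the build it targets: CUDA 12, torch 2.5, the pre-C++11 ABI, CPython 3.10, Linux x86_64. A minimal sanity-check sketch (an assumption added here, not part of the commit) that verifies the runtime matches those tags before relying on the prebuilt wheel, assuming torch is already installed:

# Sketch: confirm the interpreter and torch build match the wheel's tags
# (cu12, torch2.5, cp310, linux_x86_64) before importing flash_attn.
import platform
import sys

import torch

assert sys.version_info[:2] == (3, 10), "wheel is built for CPython 3.10 (cp310)"
assert platform.system() == "Linux" and platform.machine() == "x86_64", "wheel targets linux_x86_64"
assert torch.__version__.startswith("2.5"), "wheel is built against torch 2.5"
assert torch.version.cuda is not None and torch.version.cuda.startswith("12"), "wheel is built for CUDA 12"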