Pectics committed
Commit 2d8a02a · 1 parent: 61f2a3d

update flash attention source

Files changed (1)
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -4,4 +4,4 @@ transformers
 accelerate
 qwen-vl-utils
 gradio
-flash-attn
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
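
The pinned wheel's filename encodes the environment it was built for: CPython 3.10 (cp310), a CUDA 12.x build of torch 2.5 (cu12torch2.5), and the pre-C++11 ABI (cxx11abiFALSE). A minimal sketch, not part of this commit, that checks the local environment against those tags before running pip install -r requirements.txt:

import sys

import torch

# cp310: the wheel targets CPython 3.10.
assert sys.version_info[:2] == (3, 10), (
    f"wheel targets Python 3.10, found {sys.version_info[:2]}"
)

# torch2.5: the wheel was built against torch 2.5.
assert torch.__version__.startswith("2.5"), (
    f"wheel targets torch 2.5, found {torch.__version__}"
)

# cu12: torch must be a CUDA 12.x build (torch.version.cuda is None on CPU-only builds).
assert torch.version.cuda is not None and torch.version.cuda.startswith("12"), (
    f"wheel targets CUDA 12, found {torch.version.cuda}"
)

# cxx11abiFALSE: torch must have been built with the pre-C++11 ABI.
assert torch.compiled_with_cxx11_abi() is False, "wheel targets cxx11abi=FALSE"

If any assertion fails, the prebuilt wheel will either refuse to install or fail at import time, which is why pinning a matching release wheel (rather than building flash-attn from source via PyPI) only works when the environment is held to these exact versions.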