# TWASR / requirements.txt
gradio==5.20.1
huggingface_hub
transformers
accelerate
spaces
librosa
torch==2.6.*  # must match the prebuilt flash-attn wheel below (built against torch 2.6)
torchvision
torchaudio
peft
scipy
backoff
# flash-attn prebuilt wheel (per its filename: Python 3.10, CUDA 12, torch 2.6, cxx11abi=FALSE)
https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl