gradio==5.20.1
huggingface_hub
transformers
accelerate
spaces
librosa
torch
torchvision
torchaudio
peft
scipy
backoff

# flash-attn prebuilt wheel (CUDA 12.x, torch 2.6, CPython 3.10, linux x86_64)
https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
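# NOTE: the wheel tags above (cu12, torch2.6, cp310, cxx11abiFALSE) must match the
# runtime environment exactly; on a different torch or Python version, drop the URL
# and build flash-attn from source instead, e.g.:
#   pip install flash-attn --no-build-isolation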