marcorez8 committed
Commit 8b78aa6 · verified · 1 Parent(s): 00ab6c6

Update README.md

Files changed (1)
  1. README.md +32 -8
README.md CHANGED
@@ -1,12 +1,36 @@
  ---
- # For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
- # Doc / guide: https://huggingface.co/docs/hub/model-cards
- {}
  ---

- # flash-attn 2.7.4 for windows prebuilt wheels for nvidia blackwell cu128
- - Available for python 3.10 and 3.11
- - flash_attn-2.7.4.post1-cp310-cp310-win_amd64.whl
- - flash_attn-2.7.4.post1-cp311-cp311-win_amd64.whl

- Also works with previous generations, tested on 5090RTX and 3090RTX
  ---
+ # Model card metadata following Hugging Face specification:
+ # https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
+ # Documentation: https://huggingface.co/docs/hub/model-cards
+ license: apache-2.0
+ tags:
+ - flash-attention
+ - nvidia
+ - blackwell
+ - windows
+ - prebuilt-wheels
+ - python
+ - machine-learning
+ - deep-learning
  ---

+ # Flash-Attention 2.7.4 Prebuilt Wheels for NVIDIA Blackwell (cu128) on Windows

+ This repository provides prebuilt wheels for **Flash-Attention 2.7.4**, built against CUDA 12.8 (cu128) for NVIDIA Blackwell GPUs on Windows. Wheels are available for Python 3.10 and 3.11, so you can use the high-performance flash-attn kernels in your deep learning workflow without compiling from source.
+
+ ## Available Wheels
+ - `flash_attn-2.7.4.post1-cp310-cp310-win_amd64.whl` (Python 3.10)
+ - `flash_attn-2.7.4.post1-cp311-cp311-win_amd64.whl` (Python 3.11)
+
+ ## Compatibility
+ The wheels target NVIDIA Blackwell GPUs (cu128), but they have also been tested on previous-generation hardware. Confirmed working on the following cards (a quick environment check is sketched below the list):
+ - NVIDIA RTX 5090 (Blackwell)
+ - NVIDIA RTX 3090 (Ampere, previous generation)
+
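+ The check below is only a sketch (it is not part of the original release notes): it uses standard PyTorch calls to confirm that your interpreter version, CUDA build, and GPU match what these wheels expect.
+
+ ```python
+ import sys
+ import torch
+
+ # Wheels are provided for CPython 3.10 and 3.11 only.
+ print("Python:", sys.version_info[:2])
+
+ # The cu128 wheels assume a PyTorch build compiled against CUDA 12.8.
+ print("Torch CUDA build:", torch.version.cuda)
+
+ # Compute capability: (12, 0) on Blackwell cards such as the RTX 5090,
+ # (8, 6) on the previous-generation RTX 3090.
+ print("GPU:", torch.cuda.get_device_name(0))
+ print("Compute capability:", torch.cuda.get_device_capability(0))
+ ```
+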
+ ## Installation
+ To install, use pip with the wheel that matches your Python version:
+
+ ```bash
+ pip install flash_attn-2.7.4.post1-cp310-cp310-win_amd64.whl
+ # or, for Python 3.11
+ pip install flash_attn-2.7.4.post1-cp311-cp311-win_amd64.whl
+ ```
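+
+ After installation, a short smoke test can confirm that the extension imports and the kernel runs. This is only an illustrative sketch (not taken from the upstream flash-attn documentation) and assumes a CUDA-capable GPU with fp16 support:
+
+ ```python
+ import torch
+ from flash_attn import flash_attn_func
+
+ # Random query/key/value tensors with flash-attn's expected layout:
+ # (batch, seqlen, num_heads, head_dim), fp16 on the GPU.
+ q = torch.randn(2, 128, 8, 64, dtype=torch.float16, device="cuda")
+ k = torch.randn(2, 128, 8, 64, dtype=torch.float16, device="cuda")
+ v = torch.randn(2, 128, 8, 64, dtype=torch.float16, device="cuda")
+
+ # Causal self-attention through the fused flash-attn kernel.
+ out = flash_attn_func(q, k, v, causal=True)
+ print(out.shape)  # expected: torch.Size([2, 128, 8, 64])
+ ```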