---
license: apache-2.0
tags:
- flash-attention
- flash-attn
- wheel
- windows
- python-3.13
- blackwell
- sm_120
- cuda-13
- cu130
---
# Custom Flash Attention 2.8.3 Wheel for Windows + Python 3.13 + Blackwell
This is a from-source build of `flash-attn` v2.8.3 targeting:
- Windows 11 (win_amd64)
- Python 3.13 (cp313)
- Blackwell architecture (sm_120 support via TORCH_CUDA_ARCH_LIST=12.0)
- PyTorch with CUDA 13.0 (cu130), built for the RTX PRO 6000 Blackwell Workstation Edition
## Installation
```bash
pip install https://huggingface.co/IxaOne/flash-attn-blackwell-win-cp313/resolve/main/flash_attn-2.8.3-cp313-cp313-win_amd64.whl
```
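Because this wheel only works on the exact interpreter and platform encoded in its filename tags (`cp313` / `win_amd64`), it can help to check those tags against your environment before installing. The sketch below is a minimal, stdlib-only illustration (the `parse_wheel_tags` and `interpreter_matches` helpers are hypothetical names, not part of flash-attn); `pip` performs this compatibility check itself, so this is purely for diagnosing a "not a supported wheel on this platform" error.

```python
import sys
import sysconfig

# Filename of the wheel published in this repo.
WHEEL_NAME = "flash_attn-2.8.3-cp313-cp313-win_amd64.whl"


def parse_wheel_tags(name: str) -> tuple[str, str, str]:
    """Split a wheel filename into (python_tag, abi_tag, platform_tag).

    Wheel filenames follow the pattern
    {distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl,
    so the last three hyphen-separated fields are the compatibility tags.
    """
    stem = name[: -len(".whl")]
    parts = stem.split("-")
    return parts[-3], parts[-2], parts[-1]


def interpreter_matches(python_tag: str, platform_tag: str) -> bool:
    """Check the running interpreter against the wheel's tags."""
    here = f"cp{sys.version_info.major}{sys.version_info.minor}"
    # sysconfig reports e.g. "win-amd64"; wheel tags use underscores.
    plat = sysconfig.get_platform().replace("-", "_").replace(".", "_")
    return here == python_tag and plat == platform_tag


py_tag, abi_tag, plat_tag = parse_wheel_tags(WHEEL_NAME)
print(f"wheel wants {py_tag}/{plat_tag}; "
      f"compatible here: {interpreter_matches(py_tag, plat_tag)}")
```

After a successful install, `python -c "import flash_attn; print(flash_attn.__version__)"` should report `2.8.3` (this requires a matching PyTorch cu130 build and a Blackwell GPU to actually run the kernels).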