---
license: apache-2.0
tags:
- flash-attention
- flash-attn
- wheel
- windows
- python-3.13
- blackwell
- sm_120
- cuda-13
- cu130
---
# Custom Flash Attention 2.8.3 Wheel for Windows + Python 3.13 + Blackwell
This is a from-source build of flash-attn v2.8.3 targeting:
- Windows 11 (win_amd64)
- Python 3.13 (cp313)
- Blackwell architecture (`sm_120` support via `TORCH_CUDA_ARCH_LIST=12.0`)
- PyTorch with CUDA 13.0 (cu130), built on an RTX PRO 6000 Blackwell Workstation
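
For reference, a minimal sketch of how a wheel like this is typically built from source on Windows (PowerShell). The exact flags, `MAX_JOBS` value, and prerequisites shown here are assumptions, not the author's recorded recipe:

```
# Assumed prerequisites: MSVC Build Tools, CUDA 13.0 toolkit, a cu130 PyTorch install.
$env:TORCH_CUDA_ARCH_LIST = "12.0"   # compile kernels for Blackwell (sm_120)
$env:MAX_JOBS = "4"                  # cap parallel nvcc jobs to limit memory use
pip install ninja packaging          # build dependencies used by flash-attn
pip wheel --no-build-isolation --no-deps flash-attn==2.8.3
```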
## Installation
```
pip install https://huggingface.co/IxaOne/flash-attn-blackwell-win-cp313/resolve/main/flash_attn-2.8.3-cp313-cp313-win_amd64.whl
```
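
After installing, a quick smoke test can confirm the wheel imports and that the CUDA build matches (assumes a Blackwell GPU and a cu130 PyTorch are already present; the expected output line is illustrative):

```
python -c "import torch, flash_attn; print(flash_attn.__version__, torch.version.cuda)"
# Expected output (illustrative): 2.8.3 13.0
```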