Update README.md

patientx 2025-05-13 16:49:03 +03:00 committed by GitHub
parent 1494c9c46c
commit 3435a8dfc8


@@ -143,6 +143,8 @@ comfyui.bat
env\scripts\activate (enter)
pip install flash_attn-2.7.4.post1-py3-none-any.whl (enter)
--- To use sage-attention, change "--use-pytorch-cross-attention" to "--use-sage-attention". My advice is to make a separate batch file for each attention type instead of editing the flag back and forth (a short sketch follows below).
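For illustration, a minimal sketch of such a separate batch file. The file name comfyui-sage.bat and the launch line are assumptions; copy the launch command from your existing comfyui.bat and swap only the attention flag:

```bat
:: comfyui-sage.bat (hypothetical name) -- copy your existing comfyui.bat
:: and change only the attention flag.
@echo off
call env\Scripts\activate.bat
:: launch line is an assumption; use whatever command your comfyui.bat already runs
python main.py --use-sage-attention
pause
```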
Also, for later, when you need to repatch zluda (e.g. after a torch update), you can use:
```bash