From 3435a8dfc8106f8595b611be602d0bc869d9dcd5 Mon Sep 17 00:00:00 2001
From: patientx
Date: Tue, 13 May 2025 16:49:03 +0300
Subject: [PATCH] Update README.md

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index f989286f6..642c3951d 100644
--- a/README.md
+++ b/README.md
@@ -143,6 +143,8 @@ comfyui.bat
 env\scripts\activate (enter)
 pip install flash_attn-2.7.4.post1-py3-none-any.whl (enter)
+--- To use sage-attention, change "--use-pytorch-cross-attention" to "--use-sage-attention". My advice is to make a separate batch file for each attention type instead of editing one back and forth.
+Also, for later, when you need to repatch zluda (e.g. after a torch update), you can use:
 ```bash
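
The patched README advises keeping one batch file per attention backend. A minimal sketch of such a launcher, as a config fragment: the file name `comfyui-sage.bat`, the `env\scripts\activate` path, and `main.py` as the entry point are assumptions based on the steps shown in the diff; only the `--use-sage-attention` flag is taken directly from the patch.

```bat
@echo off
REM comfyui-sage.bat -- hypothetical launcher; duplicate and swap the flag
REM (e.g. --use-pytorch-cross-attention) for each attention backend.
call env\scripts\activate
python main.py --use-sage-attention
```

Keeping a copy per backend means switching attention types is just a matter of running a different `.bat`, with no risk of forgetting to revert an edited flag.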