Update README.md
commit 6ff5a91244 (parent a8ddea2271)
@@ -16,6 +16,7 @@ Windows-only version of ComfyUI which uses ZLUDA to get better performance with
- [Credits](#credits)
## What's New?
* Added an experimental node of mine, "CFZ Checkpoint Loader", a very basic quantizer for models. It only works reliably with SDXL and its variants (NoobAI, Illustrious, etc.), and it quantizes only the UNet (the main model), so no CLIP or VAE. BUT it gives around 23% to 29% less VRAM usage with SDXL models, while generation slows down by about 5 to 10 percent at most. This is especially good for low-VRAM folks (6GB - 8GB); it could even help on 4GB, I don't know. Feel free to copy, modify and improve it, and to try it with NVIDIA GPUs as well; this fork is AMD-only, of course, but you can take the node and try it anywhere. Just know that I am not actively working on it, and outside of SDXL I cannot guarantee any VRAM improvements, let alone a working node :) A rough sketch of the idea is shown after this list.
* Fixed the flash-attention download error, and added a sage-attention fix, mainly for the VAE out-of-memory errors that occur a lot with sage-attention enabled.
* `install-n.bat` now not only installs everything needed for MIOPEN and Flash-Attention, it also automates installing triton (only supported on Python 3.10.x and 3.11.x) and flash-attention. So if you have a 6000-series or newer GPU, HIP 6.2.4 and the extra libraries where necessary, give it a try (a small post-install sanity check is sketched after this list). But beware: there are still a lot of unresolved errors, so it is not the default installation yet.
* If you want to enable MIOPEN, Triton and Flash-Attention, use `install-n.bat`. It installs torch 2.7 with the latest nightly ZLUDA and patches the correct files into ComfyUI-Zluda. These features do not work very well yet, so you are on your own if you want to try them. (You have to install HIP 6.2.4, then download and extract the HIP Addon inside that folder; details are in the installation section below.) A rough environment check is sketched after this list.
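
The node's actual quantization scheme isn't documented here, but the general idea (keep the UNet's weights in a smaller storage dtype and cast them back to the compute dtype only when a layer runs) can be sketched roughly as follows. The class name, the fp8 storage dtype and the Linear-only coverage are illustrative assumptions, not the node's real code:

```python
# Minimal sketch of UNet weight quantization: smaller storage dtype, cast back on use.
import torch
import torch.nn as nn

class CFZLinear(nn.Module):
    """Keeps the wrapped Linear's weight in a small storage dtype and casts it back on each forward."""
    def __init__(self, linear: nn.Linear, store_dtype=torch.float8_e4m3fn):
        super().__init__()
        self.compute_dtype = linear.weight.dtype
        # fp16 -> fp8 storage roughly halves the memory of the weight tensor.
        self.register_buffer("weight", linear.weight.detach().to(store_dtype))
        self.bias = linear.bias  # left untouched; may be None

    def forward(self, x):
        # Dequantize just-in-time; this extra cast is roughly where the 5-10% slowdown comes from.
        w = self.weight.to(self.compute_dtype)
        return nn.functional.linear(x, w, self.bias)

def quantize_unet_linears(module: nn.Module) -> nn.Module:
    """Recursively replace nn.Linear layers inside the UNet (CLIP and VAE are left alone)."""
    for name, child in module.named_children():
        if isinstance(child, nn.Linear):
            setattr(module, name, CFZLinear(child))
        else:
            quantize_unet_linears(child)
    return module
```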
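
After running `install-n.bat`, a quick way to confirm the triton and flash-attention automation worked is a check along these lines (it assumes the script installed the `triton` and `flash_attn` packages under those names):

```python
# Post-install sanity check for the install-n.bat path.
import sys

# The triton automation only targets Python 3.10.x and 3.11.x.
print("python:", sys.version)
assert sys.version_info[:2] in ((3, 10), (3, 11)), "use Python 3.10.x or 3.11.x"

for pkg in ("triton", "flash_attn"):
    try:
        mod = __import__(pkg)
        print(pkg, getattr(mod, "__version__", "unknown version"))
    except ImportError as err:
        print(pkg, "not available:", err)
```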
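
Before enabling MIOPEN, Triton and Flash-Attention, it can also help to confirm that torch 2.7 and HIP 6.2.4 are what is actually in use. This sketch assumes the AMD HIP SDK exposes its install location through the `HIP_PATH` environment variable; adjust if your setup registers it differently:

```python
# Rough environment check before trying the MIOPEN / Triton / Flash-Attention path.
import os
import torch

print("torch:", torch.__version__)                        # install-n.bat targets torch 2.7
print("ZLUDA/CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))

# HIP_PATH is assumed to point at the HIP 6.2.4 install (with the HIP Addon extracted into it).
print("HIP_PATH:", os.environ.get("HIP_PATH", "not set"))
```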