Update README.md

patientx 2025-06-02 17:57:24 +03:00 committed by GitHub
parent e22e77c094
commit a20f3a657f


@@ -16,16 +16,14 @@ Windows-only version of ComfyUI which uses ZLUDA to get better performance with
- [Credits](#credits)
## What's New?
* Added "CFZ Cudnn Toggle" node, it is for some of the audio models, not working with cudnn -which is enabled by default on new install method- to use it just connect it before ksampler -latent_image input or any latent input- disable cudnn, THEN after the vae decoding -which most of these problems occur- to re-enable cudnn , add it after vae-decoding, select audio_output and connect it save audio node of course enable cudnn now.This way within that workflow you are disabling cudnn when working with models that are not compatible with it, so instead of completely disabling it in comfy we can do it locally like this.
* Added an experiment of mine, the "CFZ Checkpoint Loader", a very basic quantizer for models. It only works reliably with SDXL and its variants (NoobAI, Illustrious, etc.), and it only quantizes the UNet (the main model), not the CLIP or VAE. In return it uses roughly 23% to 29% less VRAM with SDXL models, while generation slows by about 5 to 10 percent at most. This is especially good for low-VRAM cards (6 GB - 8 GB; it might even help with 4 GB, I don't know). Feel free to copy, modify, and improve it, and to try it with NVIDIA GPUs as well; this fork is AMD-only, but you can take the node and try it anywhere. Just know that I am not actively working on it, and outside of SDXL I cannot guarantee any VRAM improvement, let alone a working node :) NOTE: It doesn't need any special packages or hardware, so it should work with any GPU. Again, please don't ask me to add support for x, etc. (See the quantization sketch after this list for the general idea.)
* HAD TO move it to the main directory; keeping it in custom_nodes on my end causes update problems. Please copy "https://github.com/patientx/ComfyUI-Zluda/blob/master/cfz_checkpoint_loader.py" into the custom_nodes folder of your ComfyUI-Zluda install and restart ComfyUI to see it. To use it, search for "cfz".
* BOTH of these nodes are inside the "cfz" folder. To use them, copy them into custom_nodes; they will appear the next time you start ComfyUI. Search for "cfz" to find both nodes.
* Fixed the flash-attention download error, and added a sage-attention fix, especially for the VAE out-of-memory errors that occur frequently when sage-attention is enabled. NOTE: as far as I know this doesn't require any special packages or hardware, so it should work with everything.
* `install-n.bat` now not only installs everything needed for MIOpen and Flash Attention, it also automates installing Triton (only supported on Python 3.10.x and 3.11.x) and flash-attention. So if you have an RX 6000-series or newer GPU, HIP 6.2.4, and the extra libraries if necessary, give it a try. But beware: there are still lots of unsolved errors, so it is not the default install.
* If you want to enable MIOpen, Triton, and Flash Attention, use `install-n.bat`. This installs torch 2.7 with the latest nightly ZLUDA and patches the correct files into ComfyUI-Zluda. These features do not work very well yet, and you are on your own if you want to try them. (You have to install HIP 6.2.4 and download and extract the HIP addon inside the folder; see the installation section below.)
* If you choose to install HIP 6.2 instead of 5.7 (and make sure the paths are set), ComfyUI-Zluda will install the latest ZLUDA instead of 3.8.4, which is used with 5.7.1.
* Florence2 is now fixed (and probably some other nodes too). You need to disable "do_sample", i.e. change it from True to False; it then works without needing to edit the node itself.
* Added an onnxruntime fix so that it now uses the CPU only, regardless of the node. Nodes like PuLID, ReActor, InfiniteYou, etc. now work without problems, and CodeFormer can be used as well. (See the onnxruntime sketch after this list.)
* Added installation of the newly required `comfyui-frontend-package` to the ZLUDA patch. ComfyUI removed its web frontend from the main repository, and it is now installed like the other packages. New installs handle this automatically, but previous users have to get it somehow, so here it is. It will install only once, and briefly, so don't be alarmed if you see "Comfyui Frontend Package installed." in the command line.
* Changed how ZLUDA is patched into ComfyUI itself. The code now lives in an external file, which is much cleaner and easier for me to change if it becomes necessary.
* Added a way to use any ZLUDA build you want (for use with whichever HIP version you prefer, such as 6.1 or 6.2). After installing, close the app and run `patchzluda2.bat`. It will ask for the URL of the ZLUDA build you want to use; you can pick one from [lshqqytiger's ZLUDA Fork](https://github.com/lshqqytiger/ZLUDA/releases). Paste the link via right click (a correct link looks like `https://github.com/lshqqytiger/ZLUDA/releases/download/rel.d60bddbc870827566b3d2d417e00e1d2d8acc026/ZLUDA-windows-rocm6-amd64.zip`), press Enter, and it will patch that ZLUDA into ComfyUI for you.
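
For reference, here is a minimal sketch of the kind of thing the "CFZ Cudnn Toggle" node does. This is not the shipped node's code; the class and field names are illustrative and follow the standard ComfyUI custom-node conventions. It simply flips PyTorch's global cuDNN switch and passes the latent through unchanged, so it can sit in front of a KSampler (cuDNN off) or after VAE decoding (cuDNN back on):

```python
import torch

class CudnnToggleSketch:
    """Illustrative only -- not the actual CFZ node code."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "latent": ("LATENT",),
            "enable_cudnn": ("BOOLEAN", {"default": False}),
        }}

    RETURN_TYPES = ("LATENT",)
    FUNCTION = "toggle"
    CATEGORY = "CFZ"

    def toggle(self, latent, enable_cudnn):
        # Flip PyTorch's global cuDNN switch, then pass the latent through
        # unchanged so the node can be chained into an existing workflow.
        torch.backends.cudnn.enabled = enable_cudnn
        return (latent,)

NODE_CLASS_MAPPINGS = {"CFZ Cudnn Toggle (sketch)": CudnnToggleSketch}
```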
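The idea behind the "CFZ Checkpoint Loader" boils down to storing the UNet weights in a lower-precision format and dequantizing them only when they are needed. The sketch below is not the node's actual code (see `cfz_checkpoint_loader.py` for that); it is a minimal int8-with-scale illustration, and the `QuantizedLinear` wrapper and `quantize_unet` helper are hypothetical names. The on-the-fly dequantization in `forward()` is where the small speed penalty mentioned above comes from, in exchange for the VRAM savings on the stored weights:

```python
import torch

class QuantizedLinear(torch.nn.Module):
    """Hypothetical wrapper: stores an nn.Linear weight as int8 plus a scale,
    and dequantizes it on the fly in forward()."""

    def __init__(self, linear: torch.nn.Linear):
        super().__init__()
        w = linear.weight.data
        # Per-tensor scale; the "or 1.0" guards against all-zero weights.
        self.scale = (float(w.abs().max()) / 127.0) or 1.0
        self.register_buffer("w_int8", torch.round(w / self.scale).to(torch.int8))
        self.bias = linear.bias

    def forward(self, x):
        # Dequantize just for this matmul; the weight stays int8 in memory.
        w = self.w_int8.to(x.dtype) * self.scale
        return torch.nn.functional.linear(x, w, self.bias)

def quantize_unet(module: torch.nn.Module) -> torch.nn.Module:
    # Recursively replace every Linear in the UNet with the int8-backed wrapper;
    # CLIP and VAE are left alone, matching the note above.
    for name, child in list(module.named_children()):
        if isinstance(child, torch.nn.Linear):
            setattr(module, name, QuantizedLinear(child))
        else:
            quantize_unet(child)
    return module
```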
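The onnxruntime fix amounts to forcing the CPU execution provider so nodes don't try (and fail) to use a GPU provider under ZLUDA. A minimal illustration of the idea, where `model.onnx` is just a placeholder path:

```python
import onnxruntime as ort

# Restrict the session to the CPU provider regardless of what is installed.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
print(session.get_providers())  # -> ['CPUExecutionProvider']
```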