Commit Graph

4896 Commits

Author SHA1 Message Date
patientx
576e200a3f
Update install-legacy.bat 2025-08-25 16:42:20 +03:00
patientx
b65516799e
Update install-for-older-amd.bat 2025-08-25 16:40:45 +03:00
patientx
e0a2d8d049
Update README.md 2025-08-25 16:34:30 +03:00
patientx
0ccee955d1
Update README.md 2025-08-25 12:44:14 +03:00
patientx
2a02479e05
Add files via upload 2025-08-25 12:41:24 +03:00
patientx
cbeac36b57
Merge pull request #272 from sfinktah/sfink-cudnn-env
Added env var TORCH_BACKENDS_CUDNN_ENABLED, defaults to 1.
2025-08-25 11:56:37 +03:00
Christopher Anderson
1b9a3b12c2 had to move cudnn disablement up much higher 2025-08-25 14:11:54 +10:00
Christopher Anderson
cd3d60254b argggh, white space hell 2025-08-25 09:52:58 +10:00
Christopher Anderson
184fa5921f worst PR ever, really. 2025-08-25 09:42:27 +10:00
Christopher Anderson
33c43b68c3 worst PR ever 2025-08-25 09:38:22 +10:00
Christopher Anderson
2a06dc8e87 Merge remote-tracking branch 'origin/sfink-cudnn-env' into sfink-cudnn-env
# Conflicts:
#	comfy/customzluda/zluda.py
2025-08-25 09:34:32 +10:00
Christopher Anderson
3504eeeb4a rebased onto upstream master (woops) 2025-08-25 09:32:34 +10:00
Christopher Anderson
7eda4587be Added env var TORCH_BACKENDS_CUDNN_ENABLED, defaults to 1. 2025-08-25 09:31:12 +10:00
Christopher Anderson
954644ef83 Added env var TORCH_BACKENDS_CUDNN_ENABLED, defaults to 1. 2025-08-25 08:56:48 +10:00
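
Editor's note: the TORCH_BACKENDS_CUDNN_ENABLED change in this PR amounts to reading an environment variable and toggling PyTorch's cuDNN backend before anything cuDNN-backed runs. A minimal sketch of that pattern, assuming a helper function and placement that are illustrative rather than the PR's actual code in comfy/customzluda/zluda.py:

```python
import os

import torch


def apply_cudnn_env_override() -> None:
    """Enable or disable torch.backends.cudnn from an environment variable.

    TORCH_BACKENDS_CUDNN_ENABLED defaults to "1" (cuDNN on); setting it to
    "0" turns cuDNN off, which can help on ZLUDA setups where cuDNN code
    paths misbehave.
    """
    flag = os.environ.get("TORCH_BACKENDS_CUDNN_ENABLED", "1").strip()
    torch.backends.cudnn.enabled = flag not in ("0", "false", "False")


# Per the follow-up commit ("had to move cudnn disablement up much higher"),
# this kind of override must run early, before any cuDNN-backed op is
# dispatched, or the setting may not take effect where it matters.
apply_cudnn_env_override()
```
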
patientx
972495d95b
Merge pull request #271 from Rando717/rando-zluda.py
Update zluda.py (MEM_BUS_WIDTH)
2025-08-24 23:48:21 +03:00
patientx
09811a22d4
Update README.md 2025-08-24 23:47:13 +03:00
patientx
3a00627363
Merge branch 'comfyanonymous:master' into master 2025-08-24 23:47:00 +03:00
patientx
da8feee425
Update README.md 2025-08-24 23:46:47 +03:00
comfyanonymous
f6b93d41a0
Remove models from readme that are not fully implemented. (#9535)
The Cosmos model implementations are currently missing the safety component, so they are technically not fully implemented and should not be advertised as such.
2025-08-24 15:40:32 -04:00
blepping
95ac7794b7
Fix EasyCache/LazyCache crash when tensor shape/dtype/device changes during sampling (#9528)
* Fix EasyCache/LazyCache crash when tensor shape/dtype/device changes during sampling

* Fix missing LazyCache check_metadata method
Ensure LazyCache reset method resets all the tensor state values
2025-08-24 15:29:49 -04:00
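
Editor's note: the crash fix above boils down to comparing a cached tensor's shape/dtype/device against the incoming tensor and invalidating the cache when they differ. A hedged sketch of that kind of guard; the class, the check_metadata name's exact signature, and the state layout are assumptions for illustration, not the actual ComfyUI classes:

```python
import torch


class CacheState:
    """Holds a previously cached tensor plus the metadata it was built with."""

    def __init__(self):
        self.cached: torch.Tensor | None = None

    def check_metadata(self, x: torch.Tensor) -> bool:
        """Return True if the cache is still usable for input x.

        If shape, dtype, or device changed mid-sampling, drop the cache
        instead of crashing on a mismatched tensor operation.
        """
        if self.cached is None:
            return False
        same = (
            self.cached.shape == x.shape
            and self.cached.dtype == x.dtype
            and self.cached.device == x.device
        )
        if not same:
            self.reset()
        return same

    def reset(self) -> None:
        # Per the second bullet of the fix: clear *all* tensor state values.
        self.cached = None
```
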
Rando717
053a6b95e5
Update zluda.py (MEM_BUS_WIDTH)
Added more cards, mostly RDNA(1) and Radeon Pro.

Reasoning: Every time zluda.py gets updated I have to manually add 256 for my RX 5700, otherwise it defaults to 128. Also, manual local edits cause git pull to fail.
2025-08-24 18:39:40 +02:00
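
Editor's note: the MEM_BUS_WIDTH change is essentially a GPU-name-to-bus-width lookup with a fallback default, which is why an unlisted card like the RX 5700 previously fell back to 128. A rough sketch of that idea; the table contents and function name are assumptions for illustration, not the actual zluda.py code:

```python
# Hypothetical excerpt: map of GPU marketing names to memory bus width in bits.
MEM_BUS_WIDTH = {
    "AMD Radeon RX 7900 XTX": 384,
    "AMD Radeon RX 6800 XT": 256,
    "AMD Radeon RX 5700 XT": 256,
    "AMD Radeon RX 5700": 256,  # the card the PR author kept re-adding by hand
}

DEFAULT_BUS_WIDTH = 128  # used when a card is not in the table


def lookup_bus_width(gpu_name: str) -> int:
    """Return the memory bus width for a GPU, falling back to the default."""
    return MEM_BUS_WIDTH.get(gpu_name, DEFAULT_BUS_WIDTH)
```
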
patientx
c92a07594b
Update zluda.py 2025-08-24 12:01:20 +03:00
patientx
dba9d20791
Update zluda.py 2025-08-24 10:23:30 +03:00
patientx
8d6defbe21
Merge branch 'comfyanonymous:master' into master 2025-08-24 03:04:59 +03:00
comfyanonymous
71ed4a399e ComfyUI version 0.3.52 2025-08-23 18:57:09 -04:00
patientx
cbf2035bfc
Update README.md 2025-08-24 01:49:06 +03:00
patientx
4ac02dcde8
Update patchzluda.bat 2025-08-24 01:44:16 +03:00
patientx
bd3da62653
Update patchzluda-n.bat 2025-08-24 01:13:12 +03:00
Christian Byrne
3e316c6338
Update frontend to v1.25.10 and revert navigation mode override (#9522)
- Update comfyui-frontend-package from 1.25.9 to 1.25.10
- Revert forced legacy navigation mode from PR #9518
- Frontend v1.25.10 includes proper navigation mode fixes and improved display text
2025-08-23 17:54:01 -04:00
patientx
9904a181e0
Update README.md 2025-08-24 00:51:48 +03:00
patientx
4187f4a942
Merge branch 'comfyanonymous:master' into master 2025-08-24 00:51:35 +03:00
patientx
829932b717
Update README.md 2025-08-24 00:51:28 +03:00
comfyanonymous
8be0d22ab7
Don't use the annoying new navigation mode by default. (#9518) 2025-08-23 13:56:17 -04:00
patientx
a5cf0d6adc
Add files via upload 2025-08-23 18:41:14 +03:00
comfyanonymous
59eddda900
Python 3.13 is well supported. (#9511) 2025-08-23 01:36:44 -04:00
patientx
cdc04b5a8a
Merge branch 'comfyanonymous:master' into master 2025-08-23 07:47:07 +03:00
comfyanonymous
41048c69b4
Fix Conditioning masks on 3d latents. (#9506) 2025-08-22 23:15:44 -04:00
Jedrzej Kosinski
fc247150fe
Implement EasyCache and Invent LazyCache (#9496)
* Attempting a universal implementation of EasyCache, starting with flux as a test; I screwed up the math a bit, but when I set it just right it works.

* Fixed math to make threshold work as expected, refactored code to use EasyCacheHolder instead of a dict wrapped by an object

* Use sigmas from transformer_options instead of timesteps to be compatible with a greater number of models, and make end_percent work

* Make the log statement when not skipping useful, in preparation for per-cond caching

* Added DIFFUSION_MODEL wrapper around forward function for wan model

* Add subsampling for heuristic inputs

* Add subsampling to output_prev (output_prev_subsampled now)

* Properly consider conds in EasyCache logic

* Created SuperEasyCache to test what happens if caching and reuse are moved outside the scope of conds, added PREDICT_NOISE wrapper to facilitate this test

* Change max reuse_threshold to 3.0

* Mark EasyCache/SuperEasyCache as experimental (beta)

* Make Lumina2 compatible with EasyCache

* Add EasyCache support for Qwen Image

* Fix missing comma, curse you Cursor

* Add EasyCache support to AceStep

* Add EasyCache support to Chroma

* Added EasyCache support to Cosmos Predict t2i

* Make EasyCache not crash with Cosmos Predict ImageToVideo latents, though it does not work well at all

* Add EasyCache support to hidream

* Added EasyCache support to hunyuan video

* Added EasyCache support to hunyuan3d

* Added EasyCache support to LTXV (not very good, but does not crash)

* Implemented EasyCache for aura_flow

* Renamed SuperEasyCache to LazyCache, hardcoded subsample_factor to 8 on nodes

* Extra logging when verbose is true for EasyCache
2025-08-22 22:41:08 -04:00
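
Editor's note: the EasyCache approach described in the bullets above is a relative-change heuristic: subsample the model input, compare it against the previous step's subsampled input, and reuse the previous output while the accumulated change stays under a reuse threshold. A simplified sketch of that logic; the class, field names, and exact error metric are assumptions, not ComfyUI's implementation:

```python
import torch


class EasyCacheSketch:
    """Toy version of an EasyCache-style skip heuristic."""

    def __init__(self, reuse_threshold: float = 0.2, subsample_factor: int = 8):
        self.reuse_threshold = reuse_threshold
        self.subsample_factor = subsample_factor
        self.prev_input = None
        self.prev_output = None
        self.accumulated = 0.0

    def _subsample(self, x: torch.Tensor) -> torch.Tensor:
        # Strided subsampling keeps the heuristic cheap on large latents.
        return x.flatten()[:: self.subsample_factor]

    def maybe_skip(self, x: torch.Tensor, forward) -> torch.Tensor:
        xs = self._subsample(x)
        if self.prev_input is not None and self.prev_output is not None:
            rel_change = (xs - self.prev_input).abs().mean() / (
                self.prev_input.abs().mean() + 1e-8
            )
            self.accumulated += rel_change.item()
            if self.accumulated < self.reuse_threshold:
                return self.prev_output  # reuse cached output, skip the model
        out = forward(x)  # run the real diffusion model forward pass
        self.prev_input, self.prev_output = xs, out
        self.accumulated = 0.0
        return out
```
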
contentis
fe31ad0276
Add elementwise fusions (#9495)
* Add elementwise fusions

* Add addcmul pattern to Qwen
2025-08-22 19:39:15 -04:00
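
Editor's note: the addcmul pattern mentioned in the fusion PR replaces a separate elementwise multiply and add with a single fused call. A small illustration using torch.addcmul, which computes input + tensor1 * tensor2; the tensor names and shapes here are illustrative, not the PR's actual Qwen code:

```python
import torch

x = torch.randn(4, 8)
scale = torch.randn(4, 8)
shift = torch.randn(4, 8)

# Unfused: two elementwise kernels (mul, then add).
out_unfused = shift + x * scale

# Fused: one addcmul op covering the same pattern.
out_fused = torch.addcmul(shift, x, scale)

assert torch.allclose(out_unfused, out_fused)
```
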
ComfyUI Wiki
ca4e96a8ae
Update template to 0.1.65 (#9501) 2025-08-22 17:40:18 -04:00
patientx
7d457bc3d0
Merge branch 'comfyanonymous:master' into master 2025-08-22 23:49:32 +03:00
Alexander Piskun
050c67323c
feat(api-nodes): add copy button to Gemini Chat node (#9440) 2025-08-22 10:51:14 -07:00
Alexander Piskun
497d41fb50
feat(api-nodes): change "OpenAI Chat" display name to "OpenAI ChatGPT" (#9443) 2025-08-22 10:50:35 -07:00
patientx
8a120713c2
Update README.md 2025-08-22 18:58:08 +03:00
patientx
1a46afdb50
Update README.md 2025-08-22 18:38:14 +03:00
patientx
f3a326392f
Update patchzluda-n.bat 2025-08-22 18:37:34 +03:00
patientx
abd06334bf
Update install-n.bat 2025-08-22 18:35:31 +03:00
patientx
8d3ea27627
Update and rename install.bat to install-legacy.bat 2025-08-22 18:33:59 +03:00
patientx
0615fbddc4
Update patchzluda-n.bat 2025-08-22 15:53:13 +03:00
patientx
d5d5ccec13
Update install-n.bat 2025-08-22 15:49:27 +03:00