Commit Graph

5745 Commits

Author SHA1 Message Date
comfyanonymous
3d0003c24c ComfyUI version 0.3.69 2025-11-17 17:17:24 -05:00
patientx
debfd20af8
Update README.md 2025-11-16 15:29:11 +03:00
patientx
2f4d04b10a
Merge branch 'comfyanonymous:master' into master 2025-11-16 15:28:55 +03:00
patientx
04b98fe354
Update README.md 2025-11-16 15:28:46 +03:00
comfyanonymous
7d6103325e
Change ROCm nightly install command to 7.1 (#10764) 2025-11-16 03:01:14 -05:00
Alexander Piskun
2d4a08b717
Revert "chore(api-nodes): mark OpenAIDalle2 and OpenAIDalle3 nodes as deprecated (#10757)" (#10759)
This reverts commit 9a02382568.
2025-11-15 12:37:34 -08:00
patientx
849a7c390e
Merge branch 'comfyanonymous:master' into master 2025-11-15 22:20:44 +03:00
Alexander Piskun
9a02382568
chore(api-nodes): mark OpenAIDalle2 and OpenAIDalle3 nodes as deprecated (#10757) 2025-11-15 11:18:49 -08:00
patientx
850c5d5db5
Merge branch 'comfyanonymous:master' into master 2025-11-15 17:45:34 +03:00
comfyanonymous
bd01d9f7fd
Add left padding support to tokenizers. (#10753) 2025-11-15 06:54:40 -05:00
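As a sketch of what left padding means in practice (the helper name and `pad_id` below are illustrative, not ComfyUI's tokenizer API):

```python
# Minimal sketch of left vs. right padding for batches of token ids.
# `pad_batch` and `pad_id` are invented for illustration.
def pad_batch(seqs, pad_id, left=True):
    longest = max(len(s) for s in seqs)
    return [
        [pad_id] * (longest - len(s)) + s if left else s + [pad_id] * (longest - len(s))
        for s in seqs
    ]

print(pad_batch([[5, 6], [7, 8, 9]], pad_id=0))
# [[0, 5, 6], [7, 8, 9]] -- sequences end-aligned rather than start-aligned
```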
patientx
6c02c6b71f
Revise README for ComfyUI-ZLUDA updates
Updated README to reflect the new ComfyUI-ZLUDA version, highlighting AMD GPU support and installation instructions.
2025-11-14 12:55:58 +03:00
patientx
9f9ce655f2
Merge branch 'comfyanonymous:master' into master 2025-11-14 12:55:30 +03:00
patientx
385d102633
Revise README for ComfyUI project
Updated project name and description, added various badges and links, and revised installation instructions for ComfyUI.
2025-11-14 12:55:23 +03:00
comfyanonymous
443056c401
Fix custom nodes import error. (#10747)
This should fix the import errors but will break if the custom nodes actually try to use the class.
2025-11-14 03:26:05 -05:00
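The commit body describes a stub that keeps imports working while failing on actual use; a minimal hypothetical sketch of that pattern (the class name is invented):

```python
# Keep a placeholder so `import` statements in custom nodes still resolve,
# while any real use of the class fails loudly -- exactly the trade-off the
# commit message warns about.
class RemovedClass:
    def __init__(self, *args, **kwargs):
        raise NotImplementedError(
            "RemovedClass is a stub kept for import compatibility; "
            "custom nodes that actually use it will break."
        )
```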
comfyanonymous
f60923590c
Use same code for chroma and flux blocks so that optimizations are shared. (#10746) 2025-11-14 01:28:05 -05:00
comfyanonymous
1ef328c007
Better instructions for the portable. (#10743) 2025-11-13 21:32:39 -05:00
rattus
94c298f962
flux: reduce VRAM usage (#10737)
Clean up a bunch of stacked tensors on Flux. This takes me from B=19 to B=22
for 1600x1600 on RTX5090.
2025-11-13 16:02:03 -08:00
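The technique here is releasing large intermediates as soon as they are consumed, which lowers the activation peak and permits a larger batch. An illustrative PyTorch sketch, not the actual Flux block code:

```python
import torch
import torch.nn.functional as F

def block_low_peak(x, w1, w2):
    # Illustrative only: drop references to large intermediates as soon as
    # they are consumed so the caching allocator can reuse that VRAM for the
    # next matmul, lowering the peak within the block.
    h = x @ w1
    a = F.gelu(h)
    del h          # `h` is dead after the activation; release it now
    return a @ w2
```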
ric-yu
2fde9597f4
feat: add create_time dict to prompt field in /history and /queue (#10741) 2025-11-13 15:11:52 -08:00
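A hedged client-side sketch of inspecting the new field; the default local server address is assumed, and the response shape is inferred from the commit title rather than verified:

```python
import requests

# Assumptions: a local ComfyUI server on the default port, and (per the
# commit title) a create_time entry attached to each history item's prompt
# field. Check your server's actual response shape before relying on this.
history = requests.get("http://127.0.0.1:8188/history").json()
for prompt_id, item in history.items():
    print(prompt_id, item.get("prompt"))
```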
patientx
66835171c6
Merge branch 'comfyanonymous:master' into master 2025-11-14 00:02:26 +03:00
Alexander Piskun
f91078b1ff
add PR template for API-Nodes (#10736) 2025-11-13 10:05:26 -08:00
patientx
5083c0225a
Update README.md 2025-11-13 13:23:41 +03:00
patientx
184fdd1103
Merge branch 'comfyanonymous:master' into master 2025-11-13 13:23:21 +03:00
patientx
bdbf42ee05
Update README.md 2025-11-13 13:23:13 +03:00
contentis
3b3ef9a77a
Quantized Ops fixes (#10715)
* offload support, bug fixes, remove mixins

* add readme
2025-11-12 18:26:52 -05:00
comfyanonymous
8b0b93df51
Update Python 3.14 compatibility notes in README (#10730) 2025-11-12 17:04:41 -05:00
patientx
9747b75093
Update README.md 2025-11-13 00:55:34 +03:00
patientx
00609a5102
Merge branch 'comfyanonymous:master' into master 2025-11-13 00:55:19 +03:00
patientx
f1c8217364
Update README.md 2025-11-13 00:55:13 +03:00
rattus
1c7eaeca10
qwen: reduce VRAM usage (#10725)
Clean up a bunch of stacked and no-longer-needed tensors on the QWEN
VRAM peak (currently FFN).

With this I go from OOMing at B=37x1328x1328 to being able to
successfully run B=47 (RTX5090).
2025-11-12 16:20:53 -05:00
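One way to verify this kind of cleanup is to compare peak allocation before and after the change; a small illustrative helper, not the commit's own test:

```python
import torch

def peak_gib(fn, *args):
    # Illustrative helper: report the peak VRAM a callable touches, which is
    # the number this tensor-lifetime cleanup is trying to lower.
    torch.cuda.reset_peak_memory_stats()
    out = fn(*args)
    torch.cuda.synchronize()
    return out, torch.cuda.max_memory_allocated() / 2**30  # GiB
```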
rattus
18e7d6dba5
mm/mp: always unload re-used but modified models (#10724)
The partial unloader path in the model re-use flow skips straight to the
actual unload without any check of the patching UUID. This means that
if you do an upscale flow with a model patch on an existing model, it
will not apply your patches.

Fix by delaying the partial_unload until after the UUID checks. This
is done by making partial_unload a mode of partial_load where extra_mem
is negative.
2025-11-12 16:19:53 -05:00
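A sketch of the control flow the commit body describes, with invented names standing in for the real comfy.model_management code:

```python
# Illustrative only; the real names differ. The point is ordering: re-check
# the patch UUID first, and express partial_unload as a partial_load with
# negative extra memory so it can run after that check.
def reuse_model(loaded, requested, free_mem_needed):
    if loaded.patches_uuid != requested.patches_uuid:
        loaded.reapply_patches(requested)      # hypothetical helper
    loaded.partial_load(extra_mem=-free_mem_needed)
```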
Qiacheng Li
e1d85e7577
Update README.md for Intel Arc GPU installation, remove IPEX (#10729)
IPEX is no longer needed for Intel Arc GPUs. Removing the instructions to set up IPEX.
2025-11-12 15:21:05 -05:00
patientx
644778be49
Merge branch 'comfyanonymous:master' into master 2025-11-12 17:55:30 +03:00
comfyanonymous
1199411747
Don't pin tensor if not a torch.nn.parameter.Parameter (#10718) 2025-11-11 19:33:30 -05:00
patientx
4e14be5d8d
Merge branch 'comfyanonymous:master' into master 2025-11-11 14:02:40 +03:00
comfyanonymous
5ebcab3c7d
Update CI workflow to remove dead macOS runner. (#10704)
* Update CI workflow to remove dead macOS runner.

* revert

* revert
2025-11-10 15:35:29 -05:00
patientx
3662d0a2ce
Merge branch 'comfyanonymous:master' into master 2025-11-10 14:05:53 +03:00
rattus
c350009236
ops: Put weight cast on the offload stream (#10697)
The weight cast needs to be on the offload stream; the bug reproduced as a black
screen with low-resolution images on a slow bus when using FP8.
2025-11-09 22:52:11 -05:00
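An illustrative sketch of the stream discipline involved, not the actual comfy.ops code: perform the cast on the offload stream and make the compute stream wait before reading the result:

```python
import torch

# Illustrative sketch: a dedicated offload stream for weight casts, with the
# compute stream synchronized so it never reads a half-written tensor (the
# black-screen failure mode described above).
offload_stream = torch.cuda.Stream()

def cast_weight(w, dtype):
    with torch.cuda.stream(offload_stream):
        w_cast = w.to(dtype, non_blocking=True)
    torch.cuda.current_stream().wait_stream(offload_stream)
    w_cast.record_stream(torch.cuda.current_stream())
    return w_cast
```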
comfyanonymous
dea899f221
Unload weights if vram usage goes up between runs. (#10690) 2025-11-09 18:51:33 -05:00
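A rough sketch of such a policy, assuming a hypothetical unload_weights callback; the real heuristic in model management may differ:

```python
import torch

# Illustrative policy sketch: remember how much VRAM the previous run left
# allocated; if a new run starts higher, unload weights before proceeding.
_prev_allocated = None

def check_vram_growth(unload_weights):
    global _prev_allocated
    used = torch.cuda.memory_allocated()
    if _prev_allocated is not None and used > _prev_allocated:
        unload_weights()
        used = torch.cuda.memory_allocated()
    _prev_allocated = used
```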
comfyanonymous
e632e5de28
Add logging for model unloading. (#10692) 2025-11-09 18:06:39 -05:00
patientx
d56cb56059
Merge branch 'comfyanonymous:master' into master 2025-11-09 13:41:51 +03:00
comfyanonymous
2abd2b5c20
Make ScaleROPE node work on Flux. (#10686) 2025-11-08 15:52:02 -05:00
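For context, a generic rotary-embedding sketch showing what scaling RoPE does; this is not the ScaleROPE node's implementation:

```python
import torch

def rope_angles(dim, positions, base=10000.0, scale=1.0):
    # Generic RoPE sketch: dividing the positions by `scale` compresses how
    # far apart tokens/patches appear to the model.
    inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2).float() / dim))
    angles = (positions[:, None] / scale) * inv_freq[None, :]
    return torch.cos(angles), torch.sin(angles)
```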
patientx
8e02689534
Merge branch 'comfyanonymous:master' into master 2025-11-07 20:30:21 +03:00
comfyanonymous
a1a70362ca
Only unpin tensor if it was pinned by ComfyUI (#10677) 2025-11-07 11:15:05 -05:00
patientx
d29dbbd829
Merge branch 'comfyanonymous:master' into master 2025-11-07 14:27:13 +03:00
rattus
cf97b033ee
mm: guard against double pin and unpin explicitly (#10672)
As commented, if you let CUDA be the one to detect double pinning/unpinning,
it actually creates an async GPU error.
2025-11-06 21:20:48 -05:00
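Together with "Only unpin tensor if it was pinned by ComfyUI" and "Don't pin tensor if not a torch.nn.parameter.Parameter" above, the pattern is explicit pin bookkeeping instead of relying on CUDA to catch mistakes. A hedged sketch with invented names, not the actual ComfyUI code:

```python
import torch

# Track what we pinned ourselves so we never double-pin or unpin someone
# else's memory (which can surface as async GPU errors if left for CUDA
# to detect).
_pinned_ptrs = set()

def pin(tensor):
    if not isinstance(tensor, torch.nn.parameter.Parameter):
        return False                      # only pin model parameters
    ptr = tensor.data_ptr()
    if ptr in _pinned_ptrs:
        return False                      # already pinned: never pin twice
    torch.cuda.cudart().cudaHostRegister(ptr, tensor.numel() * tensor.element_size(), 0)
    _pinned_ptrs.add(ptr)
    return True

def unpin(tensor):
    ptr = tensor.data_ptr()
    if ptr not in _pinned_ptrs:
        return False                      # not pinned by us: leave it alone
    torch.cuda.cudart().cudaHostUnregister(ptr)
    _pinned_ptrs.discard(ptr)
    return True
```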
comfyanonymous
eb1c42f649
Tell users they need to upload their logs in bug reports. (#10671) 2025-11-06 20:24:28 -05:00
patientx
9d1f7c5eee
Update README.md 2025-11-06 15:35:55 +03:00
patientx
3ab45ae725
Merge branch 'comfyanonymous:master' into master 2025-11-06 15:35:41 +03:00
patientx
db70e60d01
Update README.md 2025-11-06 15:35:33 +03:00
comfyanonymous
e05c907126
Clarify release cycle. (#10667) 2025-11-06 04:11:30 -05:00