patientx
a927fbd99b
Merge branch 'comfyanonymous:master' into master
2025-08-14 12:16:50 +03:00
patientx
e4f5b5f99f
Update README.md
2025-08-14 12:16:42 +03:00
Jedrzej Kosinski
e4f7ea105f
Added context window support to core sampling code ( #9238 )
* Added initial support for basic context windows - in progress
* Add prepare_sampling wrapper for context windows to more accurately estimate latent memory requirements; fix merging of wrappers/callbacks dicts in prepare_model_patcher
* Made context windows compatible with different dimensions; works for WAN, but results are bad
* Fix comfy.patcher_extension.merge_nested_dicts calls in prepare_model_patcher in sampler_helpers.py
* Considering adding some callbacks to the context window code so behavior can be extended without rewriting core code
* Made dim slicing cleaner
* Add Wan Context Windows node for testing
* Store context schedule and fuse method functions on the handler instead of requiring them to be registered in core code
* Moved some code around between node_context_windows.py and context_windows.py
* Change manual context window node names/ids
* Added callbacks to IndexListContextHandler
* Adjusted default values for context_length and context_overlap, made schema.inputs definition for WAN Context Windows less annoying
* Make get_resized_cond more robust for various dim sizes
* Fix typo
* Another small fix
2025-08-13 21:33:05 -04:00
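A minimal sketch of the index-list context-window idea from the entry above: denoise overlapping windows along one latent dimension and fuse the results by averaging the overlaps. The helper names (`get_windows`, `fuse_windows`) and defaults are illustrative only, not the actual ComfyUI API.

```python
import torch

def get_windows(num_frames: int, length: int, overlap: int):
    """Split frame indices into overlapping index-list windows (hypothetical helper)."""
    step = max(length - overlap, 1)
    windows, start = [], 0
    while start < num_frames:
        end = min(start + length, num_frames)
        windows.append(list(range(start, end)))
        if end == num_frames:
            break
        start += step
    return windows

def fuse_windows(denoise_fn, latent: torch.Tensor, length=16, overlap=4, dim=2):
    """Denoise each window separately, then average the overlapping regions."""
    num_frames = latent.shape[dim]
    out = torch.zeros_like(latent)
    counts = torch.zeros_like(latent)
    for idx in get_windows(num_frames, length, overlap):
        index = torch.tensor(idx, device=latent.device)
        window = latent.index_select(dim, index)   # slice the chosen dim
        result = denoise_fn(window)                # model call on the sub-latent
        out.index_add_(dim, index, result)         # accumulate results
        counts.index_add_(dim, index, torch.ones_like(result))
    return out / counts                            # average where windows overlap

# usage sketch: fuse_windows(lambda x: x * 0.5, torch.randn(1, 16, 81, 32, 32))
```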
Simon Lui
c991a5da65
Fix XPU iGPU regressions ( #9322 )
* Change the bf16 check, switch non-blocking transfers off by default (with an option to force them on) to regain speed on certain classes of iGPUs, and refactor the XPU check.
* Turn non_blocking off by default for xpu.
* Update README.md for Intel GPUs.
2025-08-13 19:13:35 -04:00
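For context, a hedged sketch of the non-blocking toggle described above. The flag and helper names are assumptions; the real logic lives in comfy's model management code and may differ.

```python
import torch

# Assumed flag for forcing non-blocking transfers back on for iGPUs that handle them well.
FORCE_NON_BLOCKING = False

def device_supports_non_blocking(device: torch.device) -> bool:
    # Assumption: disable non-blocking transfers on XPU unless explicitly forced,
    # since some iGPU classes regress with them.
    if device.type == "xpu":
        return FORCE_NON_BLOCKING
    return device.type == "cuda"

def cast_to_device(tensor: torch.Tensor, device: torch.device, dtype=None):
    non_blocking = device_supports_non_blocking(device)
    return tensor.to(device=device, dtype=dtype, non_blocking=non_blocking)
```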
patientx
804c7097fa
Merge branch 'comfyanonymous:master' into master
2025-08-13 23:56:43 +03:00
comfyanonymous
9df8792d4b
Make last PR not crash comfy on old pytorch. ( #9324 )
2025-08-13 15:12:41 -04:00
contentis
3da5a07510
SDPA backend priority ( #9299 )
2025-08-13 14:53:27 -04:00
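The entry above changes which scaled-dot-product-attention backend is preferred. As a point of reference only (not the ComfyUI change itself), PyTorch's public API can restrict the allowed SDPA backends like this; actual availability depends on the PyTorch build and hardware.

```python
import torch
from torch.nn.attention import SDPBackend, sdpa_kernel

q = k = v = torch.randn(1, 8, 128, 64)

# Restrict scaled_dot_product_attention to the listed backends inside the context;
# unavailable backends (e.g. flash attention on CPU) fall through to the next one.
with sdpa_kernel([SDPBackend.FLASH_ATTENTION, SDPBackend.EFFICIENT_ATTENTION, SDPBackend.MATH]):
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v)
```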
comfyanonymous
afa0a45206
Reduce portable size again. ( #9323 )
* compress more
* test
* not needed
2025-08-13 14:42:08 -04:00
patientx
a1a636c1c3
Merge branch 'comfyanonymous:master' into master
2025-08-13 12:02:39 +03:00
comfyanonymous
615eb52049
Put back frontend version. ( #9317 )
2025-08-13 03:48:06 -04:00
comfyanonymous
d5c1954d5c
ComfyUI version 0.3.50
2025-08-13 03:46:38 -04:00
comfyanonymous
e400f26c8f
Downgrade frontend for release. ( #9316 )
2025-08-13 03:44:54 -04:00
patientx
bcafc3f7a3
Merge branch 'comfyanonymous:master' into master
2025-08-13 10:36:20 +03:00
comfyanonymous
5ca8e2fac3
Update release workflow to python3.13 pytorch cu129 ( #9315 )
* Try to reduce size of portable even more.
* Update stable release workflow to python 3.13 cu129
* Update dependencies workflow to python3.13 cu129
2025-08-13 03:01:12 -04:00
ComfyUI Wiki
3294782d19
Update template to 0.1.59 ( #9313 )
2025-08-13 02:50:50 -04:00
Jedrzej Kosinski
898d88e10e
Make torchaudio exception catching less specific ( #9309 )
2025-08-12 23:34:58 -04:00
comfyanonymous
560d38f34c
Wan2.2 fun control support. ( #9292 )
2025-08-12 23:26:33 -04:00
comfyanonymous
e1d4f36d8d
Update test release package workflow with python 3.13 cu129. ( #9306 )
2025-08-12 20:13:04 -04:00
ComfyUI Wiki
1e3ae1eed8
Update template to 0.1.58 ( #9302 )
2025-08-12 17:14:27 -04:00
patientx
57897c579f
Add files via upload
2025-08-12 23:07:00 +03:00
patientx
f80a9bb674
Merge branch 'comfyanonymous:master' into master
2025-08-12 00:33:53 +03:00
Alexander Piskun
f4231a80b1
fix(Kling Image API Node): do not pass "image_type" when no image ( #9271 )
* fix(Kling Image API Node): do not pass "image_type" when no image
* fix(Kling Image API Node): raise a client-side error when kling_v1 is used with a reference image
2025-08-11 17:15:14 -04:00
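A hypothetical payload builder illustrating both fixes in the entry above: omit "image_type" when no image is supplied, and reject kling_v1 with a reference image on the client side. Field and model names follow the commit message, not the actual node implementation.

```python
def build_kling_payload(prompt: str, model: str, image=None, image_type: str = "reference"):
    # Client-side guard: kling_v1 cannot take a reference image (assumed per the fix above).
    if image is not None and model == "kling_v1":
        raise ValueError("kling_v1 does not support a reference image; use a newer model.")
    payload = {"prompt": prompt, "model": model}
    if image is not None:
        payload["image"] = image
        payload["image_type"] = image_type  # only sent when an image is present
    return payload
```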
PsychoLogicAu
2208aa616d
Support SimpleTuner lycoris lora for Qwen-Image ( #9280 )
2025-08-11 16:56:16 -04:00
ComfyUI Wiki
629b173837
Update template & embedded docs ( #9283 )
* Update template & embedded docs
* Update embedded docs to 0.2.6
2025-08-11 16:52:12 -04:00
Alexander Piskun
fa340add55
remove creation of unused asyncio_loop ( #9284 )
2025-08-11 16:48:17 -04:00
patientx
8026a7738e
Merge branch 'comfyanonymous:master' into master
2025-08-11 19:40:26 +03:00
comfyanonymous
966f3a5206
Only show feature flags log when verbose. ( #9281 )
2025-08-11 05:53:01 -04:00
patientx
83bcc75cda
Merge pull request #250 from sfinktah/sfink-better-fix-for-numpy
A better fix for numpy.
2025-08-10 22:15:50 +03:00
patientx
1320d3a833
Update install-n.bat
2025-08-10 22:13:59 +03:00
patientx
755078c6da
Update README.md
2025-08-10 12:09:37 +03:00
patientx
c2686a3968
Merge branch 'comfyanonymous:master' into master
2025-08-10 12:09:19 +03:00
patientx
edfb60d613
Update README.md
2025-08-10 12:09:10 +03:00
comfyanonymous
0552de7c7d
Bump pytorch cuda and rocm versions in readme instructions. ( #9273 )
2025-08-10 05:03:47 -04:00
comfyanonymous
5828607ccf
Not sure if AMD actually supports fp16 acc but it doesn't crash. ( #9258 )
2025-08-09 12:49:25 -04:00
Christopher Anderson
709daeca63
A better fix for numpy.
2025-08-09 04:49:15 +10:00
patientx
28fd4513fd
Create fixnumpy.bat
2025-08-08 11:51:44 +03:00
patientx
89499c6fae
Merge branch 'comfyanonymous:master' into master
2025-08-08 11:40:07 +03:00
comfyanonymous
735bb4bdb1
Users report gfx1201 is buggy on flux with pytorch attention. ( #9244 )
2025-08-08 04:21:00 -04:00
Alexander Piskun
bf2a1b5b1e
async API nodes ( #9129 )
* converted API nodes to async
* converted BFL API nodes to async
* fixed client bug; converted gemini, ideogram, minimax
* fixed client bug; converted openai nodes
* fixed client bug; converted moonvalley, pika nodes
* fixed client bug; converted kling, luma nodes
* converted pixverse, rodin nodes
* converted tripo, veo2
* converted recraft nodes
* add back the lost log_request_response call
2025-08-07 23:37:50 -04:00
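A rough sketch of the sync-to-async conversion pattern the entry above applies to the API nodes, with a stand-in for the log_request_response hook mentioned in the notes; the real client code differs.

```python
import asyncio
import aiohttp

def log_request_response(url, payload, data):
    # Stand-in for the log_request_response call referenced in the commit notes.
    print(f"POST {url} -> {len(str(data))} bytes")

async def api_call(url: str, payload: dict) -> dict:
    """Hypothetical async API helper using aiohttp instead of a blocking client."""
    async with aiohttp.ClientSession() as session:
        async with session.post(url, json=payload) as resp:
            resp.raise_for_status()
            data = await resp.json()
    log_request_response(url, payload, data)
    return data

# usage sketch: asyncio.run(api_call("https://example.com/v1/generate", {"prompt": "test"}))
```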
Jedrzej Kosinski
42974a448c
_ui.py import torchaudio safety check ( #9234 )
* Added safety around torchaudio import in _ui.py
* Trusted cursor too much, fixed torchaudio bool
2025-08-07 17:54:09 -04:00
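A minimal sketch of the guarded torchaudio import described above; the flag name is an assumption, not necessarily what _ui.py uses.

```python
# Guarded import: torchaudio can fail to import for reasons other than ImportError,
# so the except clause is deliberately broad (see also "Make torchaudio exception
# catching less specific" above).
try:
    import torchaudio
    TORCHAUDIO_AVAILABLE = True
except Exception:
    torchaudio = None
    TORCHAUDIO_AVAILABLE = False
```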
patientx
f590383b0f
Update install-for-older-amd.bat
2025-08-07 21:22:22 +03:00
patientx
ff66fa800b
Update fixforrx580.bat
2025-08-07 21:19:30 +03:00
patientx
60713b205d
Merge branch 'comfyanonymous:master' into master
2025-08-07 18:59:36 +03:00
comfyanonymous
05df2df489
Fix RepeatLatentBatch not working on multi dim latents. ( #9227 )
2025-08-07 11:20:40 -04:00
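Illustrative only: the general way to repeat a latent along its batch dimension regardless of how many trailing dimensions it has, which is roughly the behavior the fix above is about for multi-dim latents.

```python
import torch

def repeat_batch(latent: torch.Tensor, amount: int) -> torch.Tensor:
    # Repeat only along dim 0, keeping every trailing dimension unchanged,
    # so it works for [B,C,H,W] images and [B,C,T,H,W] video latents alike.
    return latent.repeat(amount, *([1] * (latent.ndim - 1)))

samples = torch.randn(2, 4, 16, 32, 32)   # a 5-dim video latent
print(repeat_batch(samples, 3).shape)     # torch.Size([6, 4, 16, 32, 32])
```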
patientx
be2e20b610
Merge branch 'comfyanonymous:master' into master
2025-08-07 09:48:29 +03:00
Christian Byrne
37d620a6b8
Update frontend to v1.24.3 ( #9175 )
2025-08-06 19:52:39 -04:00
ComfyUI Wiki
32691b16f4
Update template to 0.1.52 ( #9206 )
2025-08-06 13:26:29 -04:00
patientx
8795ae98aa
Merge branch 'comfyanonymous:master' into master
2025-08-06 20:24:47 +03:00
flybirdxx
4c3e57b0ae
Fixed an issue where Qwen LoRAs could not be loaded properly. ( #9208 )
2025-08-06 13:23:11 -04:00
patientx
e40e08e494
Update wan2.2-cfz-workflow.json
2025-08-06 15:45:22 +03:00