honglyua
0ccc88b03f
Support Iluvatar CoreX ( #8585 )
...
* Support Iluvatar CoreX
Co-authored-by: mingjiang.li <mingjiang.li@iluvatar.com>
2025-07-24 13:57:36 -04:00
patientx
30539d0d13
Merge branch 'comfyanonymous:master' into master
2025-07-24 13:59:09 +03:00
Kohaku-Blueleaf
eb2f78b4e0
[Training Node] algo support, grad acc, optional grad ckpt ( #9015 )
...
* Add factorization utils for lokr
* Add lokr train impl
* Add loha train impl
* Add adapter map for algo selection
* Add optional grad ckpt and algo selection
* Update __init__.py
* correct key name for loha
* Use custom fwd/bwd func and better init for loha
* Support gradient accumulation
* Fix bugs of loha
* use more stable init
* Add OFT training
* linting
2025-07-23 20:57:27 -04:00
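The "Support gradient accumulation" item above can be illustrated with a minimal, framework-free sketch of the pattern: scale each micro-batch gradient by the accumulation count and only apply an update every N micro-batches. This is an illustration of the general technique, not the training node's actual implementation; the function name and structure are assumptions.

```python
def train_with_grad_accum(grads, accum_steps):
    """Sketch of gradient accumulation: average `accum_steps` micro-batch
    gradients into one buffer, then emit a single update (the equivalent
    of optimizer.step() followed by zeroing the gradient buffer)."""
    updates = []
    buf = 0.0
    for i, g in enumerate(grads, 1):
        buf += g / accum_steps      # scale so the sum is a mean
        if i % accum_steps == 0:
            updates.append(buf)     # stands in for optimizer.step()
            buf = 0.0               # stands in for optimizer.zero_grad()
    return updates
```

With `accum_steps=2`, four micro-batch gradients produce two updates, each the mean of a pair.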
chaObserv
e729a5cc11
Separate denoised and noise estimation in Euler CFG++ ( #9008 )
...
This will change their behavior with the sampling CONST type.
It also combines euler_cfg_pp and euler_ancestral_cfg_pp into one main function.
2025-07-23 19:47:05 -04:00
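The separation the commit describes can be sketched as a single Euler CFG++ step: the noise estimate is derived from the *unconditional* denoised output, while the step target is the CFG-combined denoised output. This is a minimal sketch inferred from the commit message; the function name and exact update form are assumptions, not the repository's code.

```python
def euler_cfg_pp_step(x, sigma, sigma_next, denoised, uncond_denoised):
    """One Euler CFG++ step with the two quantities kept separate:
    - noise estimation uses the unconditional denoised output
    - the step target uses the CFG-combined denoised output
    """
    d = (x - uncond_denoised) / sigma   # noise estimate (uncond-based)
    return denoised + d * sigma_next    # step toward the CFG-combined target
```

At `sigma_next = 0` the step returns `denoised` exactly, which is the expected terminal behavior of an Euler-style sampler.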
comfyanonymous
d3504e1778
Enable pytorch attention by default for gfx1201 on torch 2.8 ( #9029 )
2025-07-23 19:21:29 -04:00
comfyanonymous
a86a58c308
Fix xpu function not implemented p2. ( #9027 )
2025-07-23 18:18:20 -04:00
comfyanonymous
39dda1d40d
Fix xpu function not implemented. ( #9026 )
2025-07-23 18:10:59 -04:00
patientx
58f3250106
Merge branch 'comfyanonymous:master' into master
2025-07-23 23:49:46 +03:00
comfyanonymous
5ad33787de
Add default device argument. ( #9023 )
2025-07-23 14:20:49 -04:00
patientx
bd33a5d382
Merge branch 'comfyanonymous:master' into master
2025-07-23 03:27:52 +03:00
Simon Lui
255f139863
Add xpu version for async offload and some other things. ( #9004 )
2025-07-22 15:20:09 -04:00
patientx
b049c1df82
Merge branch 'comfyanonymous:master' into master
2025-07-17 00:49:42 +03:00
comfyanonymous
491fafbd64
Silence clip tokenizer warning. ( #8934 )
2025-07-16 14:42:07 -04:00
Harel Cain
9bc2798f72
LTXV VAE decoder: switch default padding mode ( #8930 )
2025-07-16 13:54:38 -04:00
patientx
d22d65cc68
Merge branch 'comfyanonymous:master' into master
2025-07-16 13:58:56 +03:00
comfyanonymous
50afba747c
Add attempt to work around the safetensors mmap issue. ( #8928 )
2025-07-16 03:42:17 -04:00
patientx
79e3b67425
Merge branch 'comfyanonymous:master' into master
2025-07-15 12:24:08 +03:00
Yoland Yan
543c24108c
Fix wrong reference bug ( #8910 )
2025-07-14 20:45:55 -04:00
patientx
3845c2ff7a
Merge branch 'comfyanonymous:master' into master
2025-07-12 14:59:05 +03:00
comfyanonymous
b40143984c
Add model detection error hint for lora. ( #8880 )
2025-07-12 03:49:26 -04:00
patientx
5ede75293f
Merge branch 'comfyanonymous:master' into master
2025-07-11 17:30:21 +03:00
comfyanonymous
938d3e8216
Remove windows line endings. ( #8866 )
2025-07-11 02:37:51 -04:00
patientx
43514805ed
Merge branch 'comfyanonymous:master' into master
2025-07-10 21:46:22 +03:00
guill
2b653e8c18
Support for async node functions ( #8830 )
...
* Support for async execution functions
This commit adds support for node execution functions defined as async. When
a node's execution function is defined as async, we can continue
executing other nodes while it is processing.
Standard uses of `await` should "just work", but people will still have
to be careful if they spawn actual threads. Because torch doesn't really
have async/await versions of functions, this won't particularly help
with most locally-executing nodes, but it does work for e.g. web
requests to other machines.
In addition to the execute function, the `VALIDATE_INPUTS` and
`check_lazy_status` functions can also be defined as async, though we'll
only resolve one node at a time right now for those.
* Add the execution model tests to CI
* Add a missing file
It looks like this got caught by .gitignore? There's probably a better
place to put it, but I'm not sure what that is.
* Add the websocket library for automated tests
* Add additional tests for async error cases
Also fixes one bug that was found when an async function throws an error
after being scheduled on a task.
* Add a feature flags message to reduce bandwidth
We now only send 1 preview message of the latest type the client can
support.
We'll add a console warning when the client fails to send a feature
flags message at some point in the future.
* Add async tests to CI
* Don't actually add new tests in this PR
Will do it in a separate PR
* Resolve unit test in GPU-less runner
* Just remove the tests that GHA can't handle
* Change line endings to UNIX-style
* Avoid loading model_management.py so early
Because model_management.py has a top-level `logging.info`, we have to
be careful not to import that file before we call `setup_logging`. If we
do, we end up having the default logging handler registered in addition
to our custom one.
2025-07-10 14:46:19 -04:00
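The async execution model described above can be sketched with plain `asyncio`: a node whose execution function is declared `async` yields control at each `await`, letting the executor run other nodes while e.g. a web request is in flight. The class and helper names below (`ExampleAsyncNode`, `fetch_remote`) are illustrative placeholders, not ComfyUI's API.

```python
import asyncio

async def fetch_remote(url):
    # Stands in for a real network call; while this await is pending,
    # the event loop is free to execute other scheduled nodes.
    await asyncio.sleep(0)
    return f"result from {url}"

class ExampleAsyncNode:
    # A node execution function defined as async, per the commit above.
    async def execute(self, url):
        data = await fetch_remote(url)
        return (data,)

async def main():
    node = ExampleAsyncNode()
    return await node.execute("http://example.com")

result = asyncio.run(main())
```

As the commit notes, this helps little for locally-executing torch work (which has no async variants) but fits naturally for I/O-bound nodes.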
patientx
b621197729
Merge branch 'comfyanonymous:master' into master
2025-07-09 00:16:45 +03:00
chaObserv
aac10ad23a
Add SA-Solver sampler ( #8834 )
2025-07-08 16:17:06 -04:00
josephrocca
974254218a
Un-hardcode chroma patch_size ( #8840 )
2025-07-08 15:56:59 -04:00
patientx
ab04a1c165
Merge branch 'comfyanonymous:master' into master
2025-07-06 14:41:03 +03:00
comfyanonymous
75d327abd5
Remove some useless code. ( #8812 )
2025-07-06 07:07:39 -04:00
patientx
65db0a046f
Merge branch 'comfyanonymous:master' into master
2025-07-06 02:44:08 +03:00
comfyanonymous
ee615ac269
Add warning when loading file unsafely. ( #8800 )
2025-07-05 14:34:57 -04:00
patientx
94464d7867
Merge branch 'comfyanonymous:master' into master
2025-07-04 11:03:58 +03:00
chaObserv
f41f323c52
Add the denoising step to several samplers ( #8780 )
2025-07-03 19:20:53 -04:00
patientx
455fc30fd8
Merge branch 'comfyanonymous:master' into master
2025-07-03 08:08:09 +03:00
City
d9277301d2
Initial code for new SLG node ( #8759 )
2025-07-02 20:13:43 -04:00
patientx
ac99b100ef
Merge branch 'comfyanonymous:master' into master
2025-07-02 12:50:51 +03:00
comfyanonymous
111f583e00
Fallback to regular op when fp8 op throws exception. ( #8761 )
2025-07-02 00:57:13 -04:00
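The fallback behavior in the commit above follows a simple try/except pattern: attempt the fp8 path, and if the backend raises (unsupported hardware, dtype, or shapes), retry with the regular op. A generic sketch of that pattern, with injected callables rather than ComfyUI's actual op objects:

```python
def op_with_fp8_fallback(op_fp8, op_regular, *args):
    """Try the fp8 implementation first; on any exception from the
    backend, fall back to the regular-precision implementation."""
    try:
        return op_fp8(*args)
    except Exception:
        return op_regular(*args)
```

Injecting the two paths keeps the sketch testable without a GPU: pass a raising stub for the fp8 path and observe the regular path being used.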
patientx
fa03718ba9
Merge branch 'comfyanonymous:master' into master
2025-07-01 14:19:30 +03:00
chaObserv
b22e97dcfa
Migrate ER-SDE from VE to VP algorithm and add its sampler node ( #8744 )
...
Apply alpha scaling in the algorithm for reverse-time SDE and add custom ER-SDE sampler node for other solver types (SDE, ODE).
2025-07-01 02:38:52 -04:00
patientx
b093eef244
Merge branch 'comfyanonymous:master' into master
2025-06-29 15:26:39 +03:00
comfyanonymous
170c7bb90c
Fix contiguous issue with pytorch nightly. ( #8729 )
2025-06-29 06:38:40 -04:00
patientx
82eaaa0d10
Merge branch 'comfyanonymous:master' into master
2025-06-29 12:45:29 +03:00
comfyanonymous
396454fa41
Reorder the schedulers so simple is the default one. ( #8722 )
2025-06-28 18:12:56 -04:00
patientx
95625f5bde
Merge branch 'comfyanonymous:master' into master
2025-06-28 22:39:52 +03:00
xufeng
ba9548f756
"--whitelist-custom-nodes" args for comfy core to go with "--disable-all-custom-nodes" for development purposes ( #8592 )
...
* feat: "--whitelist-custom-nodes" args for comfy core to go with "--disable-all-custom-nodes" for development purposes
* feat: Simplify custom nodes whitelist logic to use consistent code paths
2025-06-28 15:24:02 -04:00
patientx
14864643e6
Merge branch 'comfyanonymous:master' into master
2025-06-28 01:17:17 +03:00
comfyanonymous
c36be0ea09
Fix memory estimation bug with kontext. ( #8709 )
2025-06-27 17:21:12 -04:00
patientx
4ab5a4bb6b
Merge branch 'comfyanonymous:master' into master
2025-06-27 21:54:55 +03:00
comfyanonymous
9093301a49
Don't add tiny bit of random noise when VAE encoding. ( #8705 )
...
Shouldn't change outputs but might make things a tiny bit more
deterministic.
2025-06-27 14:14:56 -04:00
patientx
76486ca342
Merge branch 'comfyanonymous:master' into master
2025-06-26 18:53:44 +03:00
comfyanonymous
ef5266b1c1
Support Flux Kontext Dev model. ( #8679 )
2025-06-26 11:28:41 -04:00
patientx
c1ce148f41
Merge branch 'comfyanonymous:master' into master
2025-06-26 11:34:14 +03:00
comfyanonymous
a96e65df18
Disable omnigen2 fp16 on older pytorch versions. ( #8672 )
2025-06-26 03:39:09 -04:00
patientx
7db0737e86
Merge branch 'comfyanonymous:master' into master
2025-06-26 02:41:07 +03:00
comfyanonymous
ec70ed6aea
Omnigen2 model implementation. ( #8669 )
2025-06-25 19:35:57 -04:00
patientx
8ad7604b12
Add files via upload
2025-06-26 01:45:32 +03:00
patientx
65ff46f01e
Merge branch 'comfyanonymous:master' into master
2025-06-25 12:14:05 +03:00
comfyanonymous
7a13f74220
unet -> diffusion model ( #8659 )
2025-06-25 04:52:34 -04:00
patientx
71a47f608a
Merge branch 'comfyanonymous:master' into master
2025-06-24 23:49:15 +03:00
chaObserv
8042eb20c6
Singlestep DPM++ SDE for RF ( #8627 )
...
Refactor the algorithm, and apply alpha scaling.
2025-06-24 14:59:09 -04:00
patientx
2bf05370ad
Merge branch 'comfyanonymous:master' into master
2025-06-21 14:54:22 +03:00
comfyanonymous
1883e70b43
Fix exception when using a noise mask with cosmos predict2. ( #8621 )
...
* Fix exception when using a noise mask with cosmos predict2.
* Fix ruff.
2025-06-21 03:30:39 -04:00
patientx
0e967d11b1
Merge branch 'comfyanonymous:master' into master
2025-06-20 16:18:48 +03:00
comfyanonymous
f7fb193712
Small flux optimization. ( #8611 )
2025-06-20 05:37:32 -04:00
patientx
73dfb38f5b
Merge branch 'comfyanonymous:master' into master
2025-06-20 09:44:52 +03:00
comfyanonymous
7e9267fa77
Make flux controlnet work with sd3 text enc. ( #8599 )
2025-06-19 18:50:05 -04:00
patientx
c46cc59288
Merge branch 'comfyanonymous:master' into master
2025-06-19 19:02:07 +03:00
comfyanonymous
91d40086db
Fix pytorch warning. ( #8593 )
2025-06-19 11:04:52 -04:00
patientx
e34ff57933
Add files via upload
2025-06-17 00:09:40 +03:00
patientx
4339cbea46
Merge branch 'comfyanonymous:master' into master
2025-06-16 22:21:35 +03:00
chaObserv
8e81c507d2
Multistep DPM++ SDE samplers for RF ( #8541 )
...
Include alpha in sampling and minor refactoring
2025-06-16 14:47:10 -04:00
comfyanonymous
e1c6dc720e
Allow setting min_length with tokenizer_data. ( #8547 )
2025-06-16 13:43:52 -04:00
patientx
e8327ffc3d
Merge branch 'comfyanonymous:master' into master
2025-06-15 21:42:40 +03:00
comfyanonymous
7ea79ebb9d
Add correct eps to ltxv rmsnorm. ( #8542 )
2025-06-15 12:21:25 -04:00
patientx
031f2f5120
Merge branch 'comfyanonymous:master' into master
2025-06-15 16:23:38 +03:00
comfyanonymous
d6a2137fc3
Support Cosmos predict2 image to video models. ( #8535 )
...
Use the CosmosPredict2ImageToVideoLatent node.
2025-06-14 21:37:07 -04:00
patientx
132d593223
Merge branch 'comfyanonymous:master' into master
2025-06-15 03:01:50 +03:00
chaObserv
53e8d8193c
Generalize SEEDS samplers ( #8529 )
...
Restore VP algorithm for RF and refactor noise_coeffs and half-logSNR calculations
2025-06-14 16:58:16 -04:00
patientx
8909c12ea9
Update model_management.py
2025-06-14 13:27:49 +03:00
patientx
8efd441de9
Merge branch 'comfyanonymous:master' into master
2025-06-14 12:43:19 +03:00
comfyanonymous
29596bd53f
Small cosmos attention code refactor. ( #8530 )
2025-06-14 05:02:05 -04:00
Kohaku-Blueleaf
520eb77b72
LoRA Trainer: LoRA training node in weight adapter scheme ( #8446 )
2025-06-13 19:25:59 -04:00
patientx
e3720cd495
Merge branch 'comfyanonymous:master' into master
2025-06-13 15:04:05 +03:00
comfyanonymous
c69af655aa
Uncap cosmos predict2 res and fix mem estimation. ( #8518 )
2025-06-13 07:30:18 -04:00
comfyanonymous
251f54a2ad
Basic initial support for cosmos predict2 text to image 2B and 14B models. ( #8517 )
2025-06-13 07:05:23 -04:00
patientx
a5e9b6729c
Update zluda.py
2025-06-13 01:40:34 +03:00
patientx
7bd5bcd135
Update zluda-default.py
2025-06-13 01:40:11 +03:00
patientx
c06a15d8a5
Update zluda.py
2025-06-13 01:03:45 +03:00
patientx
896bda9003
Update zluda-default.py
2025-06-13 01:03:14 +03:00
patientx
7d5f4074b6
Add files via upload
2025-06-12 16:10:32 +03:00
patientx
79dda39260
Update zluda.py
2025-06-12 13:20:24 +03:00
patientx
08784dc90d
Update zluda.py
2025-06-12 13:19:59 +03:00
patientx
11af025690
Update zluda.py
2025-06-12 13:11:03 +03:00
patientx
828b7636d0
Update zluda.py
2025-06-12 13:10:40 +03:00
patientx
f53791d5d2
Merge branch 'comfyanonymous:master' into master
2025-06-12 00:32:55 +03:00
pythongosssss
50c605e957
Add support for sqlite database ( #8444 )
...
* Add support for sqlite database
* fix
2025-06-11 16:43:39 -04:00
comfyanonymous
8a4ff747bd
Fix mistake in last commit. ( #8496 )
...
* Move to right place.
2025-06-11 15:13:29 -04:00
comfyanonymous
af1eb58be8
Fix black images on some flux models in fp16. ( #8495 )
2025-06-11 15:09:11 -04:00
patientx
06ac233007
Merge branch 'comfyanonymous:master' into master
2025-06-10 20:34:42 +03:00
comfyanonymous
6e28a46454
Apple most likely is never fixing the fp16 attention bug. ( #8485 )
2025-06-10 13:06:24 -04:00
patientx
4bc3866c67
Merge branch 'comfyanonymous:master' into master
2025-06-09 21:10:00 +03:00
comfyanonymous
7f800d04fa
Enable AMD fp8 and pytorch attention on some GPUs. ( #8474 )
...
Information is from the pytorch source code.
2025-06-09 12:50:39 -04:00
patientx
b4d015f5f3
Merge branch 'comfyanonymous:master' into master
2025-06-08 21:21:41 +03:00
comfyanonymous
97755eed46
Enable fp8 ops by default on gfx1201 ( #8464 )
2025-06-08 14:15:34 -04:00
patientx
156aedd995
Merge branch 'comfyanonymous:master' into master
2025-06-07 19:30:45 +03:00
comfyanonymous
daf9d25ee2
Cleaner torch version comparisons. ( #8453 )
2025-06-07 10:01:15 -04:00
patientx
d28b4525b3
Merge branch 'comfyanonymous:master' into master
2025-06-06 17:10:34 +03:00
comfyanonymous
3b4b171e18
Alternate fix for #8435 ( #8442 )
2025-06-06 09:43:27 -04:00
patientx
67fc8e3325
Merge branch 'comfyanonymous:master' into master
2025-06-06 01:42:57 +03:00
comfyanonymous
4248b1618f
Let chroma TE work on regular flux. ( #8429 )
2025-06-05 10:07:17 -04:00
patientx
9aeff135b2
Update zluda.py
2025-06-02 02:55:19 +03:00
patientx
803f82189a
Merge branch 'comfyanonymous:master' into master
2025-06-01 17:44:48 +03:00
comfyanonymous
fb4754624d
Make the casting in lists the same as regular inputs. ( #8373 )
2025-06-01 05:39:54 -04:00
comfyanonymous
19e45e9b0e
Make it easier to pass lists of tensors to models. ( #8358 )
2025-05-31 20:00:20 -04:00
patientx
d74ffb792a
Merge branch 'comfyanonymous:master' into master
2025-05-31 01:55:42 +03:00
drhead
08b7cc7506
use fused multiply-add pointwise ops in chroma ( #8279 )
2025-05-30 18:09:54 -04:00
patientx
07b8d211e6
Merge branch 'comfyanonymous:master' into master
2025-05-30 23:48:15 +03:00
comfyanonymous
704fc78854
Put ROCm version in tuple to make it easier to enable stuff based on it. ( #8348 )
2025-05-30 15:41:02 -04:00
patientx
c74742444d
Merge branch 'comfyanonymous:master' into master
2025-05-29 18:29:06 +03:00
comfyanonymous
f2289a1f59
Delete useless file. ( #8327 )
2025-05-29 08:29:37 -04:00
patientx
46a997fb23
Merge branch 'comfyanonymous:master' into master
2025-05-29 10:56:01 +03:00
comfyanonymous
5e5e46d40c
Not really tested WAN Phantom Support. ( #8321 )
2025-05-28 23:46:15 -04:00
comfyanonymous
1c1687ab1c
Support HiDream SimpleTuner loras. ( #8318 )
2025-05-28 18:47:15 -04:00
patientx
5b5165371e
Merge branch 'comfyanonymous:master' into master
2025-05-28 01:06:44 +03:00
comfyanonymous
06c661004e
Memory estimation code can now take into account conds. ( #8307 )
2025-05-27 15:09:05 -04:00
patientx
8609a6dced
Merge branch 'comfyanonymous:master' into master
2025-05-27 01:03:35 +03:00
comfyanonymous
89a84e32d2
Disable initial GPU load when novram is used. ( #8294 )
2025-05-26 16:39:27 -04:00
patientx
bbcb33ea72
Merge branch 'comfyanonymous:master' into master
2025-05-26 16:26:39 +03:00
comfyanonymous
e5799c4899
Enable pytorch attention by default on AMD gfx1151 ( #8282 )
2025-05-26 04:29:25 -04:00
patientx
48bbdd0842
Merge branch 'comfyanonymous:master' into master
2025-05-25 15:38:51 +03:00
comfyanonymous
a0651359d7
Return proper error if diffusion model not detected properly. ( #8272 )
2025-05-25 05:28:11 -04:00
patientx
9790aaac7b
Merge branch 'comfyanonymous:master' into master
2025-05-24 14:00:54 +03:00
comfyanonymous
5a87757ef9
Better error if sageattention is installed but a dependency is missing. ( #8264 )
2025-05-24 06:43:12 -04:00
patientx
3b69a08c08
Merge branch 'comfyanonymous:master' into master
2025-05-24 04:06:28 +03:00
comfyanonymous
0b50d4c0db
Add argument to explicitly enable fp8 compute support. ( #8257 )
...
This can be used to test if your current GPU/pytorch version supports fp8 matrix mult in combination with --fast or the fp8_e4m3fn_fast dtype.
2025-05-23 17:43:50 -04:00
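The kind of check this argument exposes can be sketched as a probe: attempt a tiny fp8 (e4m3fn) matrix multiply and report whether it runs. Everything below is an assumption-laden sketch, not the repository's code; `torch._scaled_mm` is a private PyTorch API whose signature has varied across versions, so the probe treats any failure (including torch or CUDA being absent) as "unsupported".

```python
def probe_fp8_matmul():
    """Return True if a small fp8 e4m3fn matmul succeeds on the current
    GPU/pytorch combination, False otherwise (sketch only)."""
    try:
        import torch
        a = torch.randn(16, 16, device="cuda").to(torch.float8_e4m3fn)
        b = torch.randn(16, 16, device="cuda").to(torch.float8_e4m3fn).t()
        scale = torch.tensor(1.0, device="cuda")
        # Private API; signature assumed from recent PyTorch releases.
        torch._scaled_mm(a, b, scale_a=scale, scale_b=scale)
        return True
    except Exception:
        return False
```

On a machine without a capable GPU this simply returns False, which mirrors what one would learn by launching with the flag and watching for errors.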
patientx
3e49c3e2ff
Merge branch 'comfyanonymous:master' into master
2025-05-24 00:01:56 +03:00
drhead
30b2eb8a93
create arange on-device ( #8255 )
2025-05-23 16:15:06 -04:00
patientx
c653935b37
Merge branch 'comfyanonymous:master' into master
2025-05-23 13:57:11 +03:00
LuXuxue
dc4958db54
add some architectures to utils.py
2025-05-23 13:54:03 +08:00
comfyanonymous
f85c08df06
Make VACE conditionings stackable. ( #8240 )
2025-05-22 19:22:26 -04:00
comfyanonymous
87f9130778
Revert "This doesn't seem to be needed on chroma. ( #8209 )" ( #8210 )
...
This reverts commit 7e84bf5373.
2025-05-20 05:39:55 -04:00
comfyanonymous
7e84bf5373
This doesn't seem to be needed on chroma. ( #8209 )
2025-05-20 05:29:23 -04:00
patientx
acf83b60f4
Merge branch 'comfyanonymous:master' into master
2025-05-18 11:37:17 +03:00
comfyanonymous
aee2908d03
Remove useless log. ( #8166 )
2025-05-17 06:27:34 -04:00
patientx
de4e3dd19a
Merge branch 'comfyanonymous:master' into master
2025-05-16 02:43:05 +03:00
comfyanonymous
1c2d45d2b5
Fix typo in last PR. ( #8144 )
...
More robust model detection for future proofing.
2025-05-15 19:02:19 -04:00
George0726
c820ef950d
Add Wan-FUN Camera Control models and Add WanCameraImageToVideo node ( #8013 )
...
* support wan camera models
* fix by ruff check
* change camera_condition type; make camera_condition optional
* support camera trajectory nodes
* fix camera direction
---------
Co-authored-by: Qirui Sun <sunqr0667@126.com>
2025-05-15 19:00:43 -04:00
patientx
1cf25c6980
Add files via upload
2025-05-15 13:55:15 +03:00
patientx
44cac886c4
Create quant_per_block.py
2025-05-15 13:54:47 +03:00
patientx
01aae8eddc
Create fwd_prefill.py
2025-05-15 13:52:35 +03:00