patientx
b8ab0f2091
Merge branch 'comfyanonymous:master' into master
2025-02-04 12:32:22 +03:00
comfyanonymous
3e880ac709
Fix on python 3.9
2025-02-04 04:20:56 -05:00
comfyanonymous
e5ea112a90
Support Lumina 2 model.
2025-02-04 04:16:30 -05:00
patientx
bf081a208a
Merge branch 'comfyanonymous:master' into master
2025-02-02 18:27:25 +03:00
comfyanonymous
44e19a28d3
Use maximum negative value instead of -inf for masks in text encoders.
This is probably more correct.
2025-02-02 09:46:00 -05:00
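As a hedged illustration of the idea in the commit above (not the repository's exact code), an additive attention mask can use the dtype's most negative finite value rather than -inf, which is better behaved in fp16/bf16 softmax:

```python
import torch

def additive_attention_mask(keep: torch.Tensor, dtype: torch.dtype) -> torch.Tensor:
    """Turn a boolean padding mask (True = attend) into an additive mask.

    Masked positions receive the most negative finite value of `dtype`
    instead of float("-inf"), avoiding inf arithmetic in low-precision softmax.
    """
    neg = torch.finfo(dtype).min
    zero = torch.zeros((), dtype=dtype)
    return torch.where(keep, zero, torch.full((), neg, dtype=dtype))
```

The resulting tensor is added to the attention scores before the softmax.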
patientx
a5eb46e557
Merge branch 'comfyanonymous:master' into master
2025-02-02 17:30:16 +03:00
Dr.Lt.Data
0a0df5f136
better guide message for sageattention ( #6634 )
2025-02-02 09:26:47 -05:00
KarryCharon
24d6871e47
add disable-compres-response-body cli args; add compress middleware; ( #6672 )
2025-02-02 09:24:55 -05:00
patientx
2ccb7dd301
Merge branch 'comfyanonymous:master' into master
2025-02-01 15:33:58 +03:00
comfyanonymous
9e1d301129
Only use stable cascade lora format with cascade model.
2025-02-01 06:35:22 -05:00
patientx
97f5a7d844
Merge branch 'comfyanonymous:master' into master
2025-01-30 23:03:44 +03:00
comfyanonymous
8d8dc9a262
Allow batch of different sigmas when noise scaling.
2025-01-30 06:49:52 -05:00
patientx
98cf486504
Merge branch 'comfyanonymous:master' into master
2025-01-29 18:53:00 +03:00
filtered
222f48c0f2
Allow changing folder_paths.base_path via command line argument. ( #6600 )
* Reimpl. CLI arg directly inside folder_paths.
* Update tests to use CLI arg mocking.
* Revert last-minute refactor.
* Fix test state pollution.
2025-01-29 08:06:28 -05:00
patientx
a5c944f482
Merge branch 'comfyanonymous:master' into master
2025-01-28 19:20:13 +03:00
comfyanonymous
13fd4d6e45
More friendly error messages for corrupted safetensors files.
2025-01-28 09:41:09 -05:00
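A minimal sketch of the pattern the commit above describes, assuming the standard `safetensors.torch.load_file` API; the function name and message text are illustrative, not the project's actual code:

```python
from safetensors.torch import load_file

def load_safetensors_checked(path: str):
    """Load a .safetensors file, converting low-level failures into a clear hint."""
    try:
        return load_file(path)
    except Exception as exc:
        raise RuntimeError(
            f"Could not read '{path}'. The file may be corrupted or only partially "
            f"downloaded; try re-downloading it. Original error: {exc}"
        ) from exc
```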
patientx
f77fea7fc6
Merge branch 'comfyanonymous:master' into master
2025-01-27 23:45:19 +03:00
comfyanonymous
255edf2246
Lower minimum ratio of loaded weights on Nvidia.
2025-01-27 05:26:51 -05:00
patientx
1442e34d9e
Merge branch 'comfyanonymous:master' into master
2025-01-26 15:27:37 +03:00
comfyanonymous
67feb05299
Remove redundant code.
2025-01-25 19:04:53 -05:00
patientx
73433fffa0
Merge branch 'comfyanonymous:master' into master
2025-01-24 15:16:37 +03:00
comfyanonymous
14ca5f5a10
Remove useless code.
2025-01-24 06:15:54 -05:00
patientx
076d037620
Merge branch 'comfyanonymous:master' into master
2025-01-23 13:56:49 +03:00
comfyanonymous
96e2a45193
Remove useless code.
2025-01-23 05:56:23 -05:00
Chenlei Hu
dfa2b6d129
Remove unused function lcm in conds.py ( #6572 )
2025-01-23 05:54:09 -05:00
patientx
d9515843d0
Merge branch 'comfyanonymous:master' into master
2025-01-23 02:28:08 +03:00
comfyanonymous
d6bbe8c40f
Remove support for python 3.8.
2025-01-22 17:04:30 -05:00
patientx
fc09aa398c
Merge branch 'comfyanonymous:master' into master
2025-01-22 15:04:13 +03:00
chaObserv
e857dd48b8
Add gradient estimation sampler ( #6554 )
2025-01-22 05:29:40 -05:00
patientx
3402d28bcd
Merge branch 'comfyanonymous:master' into master
2025-01-21 00:06:28 +03:00
comfyanonymous
fb2ad645a3
Add FluxDisableGuidance node to disable using the guidance embed.
2025-01-20 14:50:24 -05:00
patientx
39e164a441
Merge branch 'comfyanonymous:master' into master
2025-01-20 19:27:59 +03:00
comfyanonymous
d8a7a32779
Cleanup old TODO.
2025-01-20 03:44:13 -05:00
patientx
df418fc3fd
Added ROCm and HIP hiding for certain nodes that were throwing errors under ZLUDA
2025-01-19 17:49:55 +03:00
patientx
88eae2b0d1
Merge branch 'comfyanonymous:master' into master
2025-01-19 15:11:53 +03:00
Sergii Dymchenko
ebf038d4fa
Use torch.special.expm1 ( #6388 )
* Use `torch.special.expm1`
This function provides greater precision than `exp(x) - 1` for small values of `x`.
Found with TorchFix https://github.com/pytorch-labs/torchfix/
* Use non-alias
2025-01-19 04:54:32 -05:00
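A small self-contained demonstration of the precision point made in the commit body; the input value is generic, not taken from the ComfyUI code:

```python
import torch

x = torch.tensor(1e-10, dtype=torch.float32)

naive = torch.exp(x) - 1          # cancellation: exp(x) rounds to 1.0, so this is 0.0
precise = torch.special.expm1(x)  # keeps the leading term, approximately 1e-10

print(naive.item(), precise.item())
```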
patientx
8ee302c9dd
Merge branch 'comfyanonymous:master' into master
2025-01-19 02:40:53 +03:00
catboxanon
b1a02131c9
Remove comfy.samplers self-import ( #6506 )
2025-01-18 17:49:51 -05:00
patientx
e3207a0560
Merge branch 'comfyanonymous:master' into master
2025-01-18 16:08:27 +03:00
comfyanonymous
507199d9a8
Uni pc sampler now works with audio and video models.
2025-01-18 05:27:58 -05:00
comfyanonymous
2f3ab40b62
Add warning when using old pytorch versions.
2025-01-17 18:47:27 -05:00
patientx
5388be0f56
Merge branch 'comfyanonymous:master' into master
2025-01-17 09:41:17 +03:00
comfyanonymous
0aa2368e46
Fix some cosmos fp8 issues.
2025-01-16 17:45:37 -05:00
comfyanonymous
cca96a85ae
Fix cosmos VAE failing with videos longer than 121 frames.
2025-01-16 16:30:06 -05:00
patientx
4afa79a368
Merge branch 'comfyanonymous:master' into master
2025-01-16 17:23:00 +03:00
comfyanonymous
31831e6ef1
Code refactor.
2025-01-16 07:23:54 -05:00
comfyanonymous
88ceb28e20
Tweak hunyuan memory usage factor.
2025-01-16 06:31:03 -05:00
patientx
ed13b68e4f
Merge branch 'comfyanonymous:master' into master
2025-01-16 13:59:32 +03:00
comfyanonymous
23289a6a5c
Clean up some debug lines.
2025-01-16 04:24:39 -05:00
comfyanonymous
9d8b6c1f46
More accurate memory estimation for cosmos and hunyuan video.
2025-01-16 03:48:40 -05:00
patientx
a779e34c5b
Merge branch 'comfyanonymous:master' into master
2025-01-16 11:29:26 +03:00
comfyanonymous
6320d05696
Slightly lower hunyuan video memory usage.
2025-01-16 00:23:01 -05:00
comfyanonymous
25683b5b02
Lower cosmos diffusion model memory usage.
2025-01-15 23:46:42 -05:00
comfyanonymous
4758fb64b9
Lower cosmos VAE memory usage by a bit.
2025-01-15 22:57:52 -05:00
comfyanonymous
008761166f
Optimize first attention block in cosmos VAE.
2025-01-15 21:48:46 -05:00
patientx
fb0d92b160
Merge branch 'comfyanonymous:master' into master
2025-01-15 23:33:05 +03:00
comfyanonymous
cba58fff0b
Remove unsafe embedding load for very old pytorch.
2025-01-15 04:32:23 -05:00
patientx
3ece3091d4
Merge branch 'comfyanonymous:master' into master
2025-01-15 11:55:19 +03:00
comfyanonymous
2feb8d0b77
Force safe loading of files in torch format on pytorch 2.4+
If this breaks something for you make an issue.
2025-01-15 03:50:27 -05:00
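A hedged sketch of the safe-load pattern on recent PyTorch; the function name is illustrative and this is not the repository's exact code path:

```python
import torch

def load_torch_file_safe(path: str):
    """Load a legacy .pt/.ckpt checkpoint without unpickling arbitrary objects.

    With weights_only=True (enforced here for PyTorch 2.4+, per the commit above),
    the unpickler only reconstructs tensors and plain containers, so a malicious
    checkpoint cannot execute code on load.
    """
    return torch.load(path, map_location="cpu", weights_only=True)
```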
patientx
104e6a9685
Merge branch 'comfyanonymous:master' into master
2025-01-15 03:59:33 +03:00
Pam
c78a45685d
Rewrite res_multistep sampler and implement res_multistep_cfg_pp sampler. ( #6462 )
2025-01-14 18:20:06 -05:00
patientx
4f01f72bed
Update zluda.py
2025-01-14 20:03:55 +03:00
patientx
c4861c74d4
Updated ZLUDA patching method
2025-01-14 19:57:22 +03:00
patientx
c3fc894ce2
Add files via upload
2025-01-14 19:54:44 +03:00
patientx
c7ebd121d6
Merge branch 'comfyanonymous:master' into master
2025-01-14 15:50:05 +03:00
comfyanonymous
3aaabb12d4
Implement Cosmos Image/Video to World (Video) diffusion models.
Use CosmosImageToVideoLatent to set the input image/video.
2025-01-14 05:14:10 -05:00
patientx
aa2a83ec35
Merge branch 'comfyanonymous:master' into master
2025-01-13 14:46:19 +03:00
comfyanonymous
1f1c7b7b56
Remove useless code.
2025-01-13 03:52:37 -05:00
patientx
b03621d13b
Merge branch 'comfyanonymous:master' into master
2025-01-12 14:31:19 +03:00
comfyanonymous
90f349f93d
Add res_multistep sampler from the cosmos code.
This sampler should work with all models.
2025-01-12 03:10:07 -05:00
patientx
d319705d78
Merge branch 'comfyanonymous:master' into master
2025-01-12 00:39:02 +03:00
Jedrzej Kosinski
6c9bd11fa3
Hooks Part 2 - TransformerOptionsHook and AdditionalModelsHook ( #6377 )
* Add 'sigmas' to transformer_options so that downstream code can know about the full scope of current sampling run, fix Hook Keyframes' guarantee_steps=1 inconsistent behavior with sampling split across different Sampling nodes/sampling runs by referencing 'sigmas'
* Cleaned up hooks.py, refactored Hook.should_register and add_hook_patches to use target_dict instead of target so that more information can be provided about the current execution environment if needed
* Refactor WrapperHook into TransformerOptionsHook, as there is no need to separate out Wrappers/Callbacks/Patches into different hook types (all affect transformer_options)
* Refactored HookGroup to also store a dictionary of hooks separated by hook_type, modified necessary code to no longer need to manually separate out hooks by hook_type
* In inner_sample, change "sigmas" to "sampler_sigmas" in transformer_options to not conflict with the "sigmas" that will overwrite "sigmas" in _calc_cond_batch
* Refactored 'registered' to be HookGroup instead of a list of Hooks, made AddModelsHook operational and compliant with should_register result, moved TransformerOptionsHook handling out of ModelPatcher.register_all_hook_patches, support patches in TransformerOptionsHook properly by casting any patches/wrappers/hooks to proper device at sample time
* Made hook clone code sane, made clear ObjectPatchHook and SetInjectionsHook are not yet operational
* Fix performance of hooks when hooks are appended via Cond Pair Set Props nodes by properly caching between positive and negative conds, make hook_patches_backup behave as intended (in the case that something pre-registers WeightHooks on the ModelPatcher instead of registering it at sample time)
* Filter only registered hooks on self.conds in CFGGuider.sample
* Make hook_scope functional for TransformerOptionsHook
* Removed 4 whitespace lines to satisfy Ruff
* Add a get_injections function to ModelPatcher
* Made TransformerOptionsHook contribute to registered hooks properly, added some doc strings and removed a so-far unused variable
* Rename AddModelsHooks to AdditionalModelsHook, rename SetInjectionsHook to InjectionsHook (not yet implemented, but at least getting the naming figured out)
* Clean up a typehint
2025-01-11 12:20:23 -05:00
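As a hedged sketch of one detail from the PR description above (the `sigmas` / `sampler_sigmas` split in `transformer_options`), downstream code might distinguish the per-batch sigmas from the full run's schedule roughly as below; the key names follow the wording of the PR and are not verified against current sources:

```python
def read_sigma_context(transformer_options: dict):
    """Separate the sigmas of the current cond batch from the whole sampling run."""
    step_sigmas = transformer_options.get("sigmas")          # set per _calc_cond_batch call
    run_sigmas = transformer_options.get("sampler_sigmas")   # full schedule of the sampling run
    return step_sigmas, run_sigmas
```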
patientx
8ac79a7563
Merge branch 'comfyanonymous:master' into master
2025-01-11 15:04:10 +03:00
comfyanonymous
ee8a7ab69d
Fast latent preview for Cosmos.
2025-01-11 04:41:24 -05:00
patientx
7d45042e2e
Merge branch 'comfyanonymous:master' into master
2025-01-10 18:31:48 +03:00
comfyanonymous
2ff3104f70
WIP support for Nvidia Cosmos 7B and 14B text to world (video) models.
2025-01-10 09:14:16 -05:00
patientx
00cf1206e8
Merge branch 'comfyanonymous:master' into master
2025-01-10 15:44:49 +03:00
comfyanonymous
129d8908f7
Add argument to skip the output reshaping in the attention functions.
2025-01-10 06:27:37 -05:00
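A hedged sketch of what such an argument typically looks like; the parameter name `skip_output_reshape` and the shapes are assumptions for illustration, not the exact ComfyUI signature:

```python
import torch
import torch.nn.functional as F

def attention(q, k, v, skip_output_reshape: bool = False):
    """Attention over (batch, heads, tokens, dim_head) inputs.

    By default the result is flattened back to (batch, tokens, heads * dim_head);
    callers that want to keep the per-head layout can skip that reshape.
    """
    out = F.scaled_dot_product_attention(q, k, v)
    if skip_output_reshape:
        return out  # (batch, heads, tokens, dim_head)
    b, h, t, d = out.shape
    return out.transpose(1, 2).reshape(b, t, h * d)
```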
patientx
c37b5ccf29
Merge branch 'comfyanonymous:master' into master
2025-01-09 18:15:43 +03:00
comfyanonymous
ff838657fa
Cleaner handling of attention mask in ltxv model code.
2025-01-09 07:12:03 -05:00
patientx
0b933e1634
Merge branch 'comfyanonymous:master' into master
2025-01-09 12:57:31 +03:00
comfyanonymous
2307ff6746
Improve some logging messages.
2025-01-08 19:05:22 -05:00
patientx
867659b035
Merge branch 'comfyanonymous:master' into master
2025-01-08 01:49:22 +03:00
comfyanonymous
d0f3752e33
Properly calculate inner dim for t5 model.
This is required to support some different types of t5 models.
2025-01-07 17:33:03 -05:00
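The point of the commit above, sketched with hedged example numbers: in T5-style attention the projection width is `num_heads * d_kv`, which is not guaranteed to equal `d_model`, so some variants break if the inner dimension is derived from the model width:

```python
def t5_inner_dim(num_heads: int, d_kv: int) -> int:
    """Width of the q/k/v projections: heads times per-head dimension."""
    return num_heads * d_kv

print(t5_inner_dim(num_heads=12, d_kv=64))  # 768: happens to match a d_model of 768
print(t5_inner_dim(num_heads=6, d_kv=64))   # 384: smaller than a d_model of 512,
                                            # so inner_dim must not be assumed == d_model
```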
patientx
e1aa83d068
Merge branch 'comfyanonymous:master' into master
2025-01-07 13:57:13 +03:00
comfyanonymous
4209edf48d
Make a few more samplers deterministic.
2025-01-07 02:12:32 -05:00
patientx
58cfa6b7f3
Merge branch 'comfyanonymous:master' into master
2025-01-07 09:25:41 +03:00
Chenlei Hu
d055325783
Document get_attr and get_model_object ( #6357 )
* Document get_attr and get_model_object
* Update model_patcher.py
* Update model_patcher.py
* Update model_patcher.py
2025-01-06 20:12:22 -05:00
patientx
6c1e37c091
Merge branch 'comfyanonymous:master' into master
2025-01-06 14:50:48 +03:00
comfyanonymous
916d1e14a9
Make ancestral samplers more deterministic.
2025-01-06 03:04:32 -05:00
Jedrzej Kosinski
c496e53519
In inner_sample, change "sigmas" to "sampler_sigmas" in transformer_options to not conflict with the "sigmas" that will overwrite "sigmas" in _calc_cond_batch ( #6360 )
2025-01-06 01:36:47 -05:00
patientx
a5a09e45dd
Merge branch 'comfyanonymous:master' into master
2025-01-04 15:42:12 +03:00
comfyanonymous
d45ebb63f6
Remove old unused function.
2025-01-04 07:20:54 -05:00
patientx
c523c36aef
Merge branch 'comfyanonymous:master' into master
2025-01-02 19:55:49 +03:00
comfyanonymous
9e9c8a1c64
Clear cache as often on AMD as Nvidia.
I think the issue this was working around has been solved.
If you notice that this change slows things down or causes stutters on
your AMD GPU with ROCm on Linux please report it.
2025-01-02 08:44:16 -05:00
patientx
bfc4fb0efb
Merge branch 'comfyanonymous:master' into master
2025-01-02 00:37:45 +03:00
Andrew Kvochko
0f11d60afb
Fix temporal tiling for decoder, remove redundant tiles. ( #6306 )
This commit fixes the temporal tile size calculation, and removes
a redundant tile at the end of the range when its elements are
completely covered by the previous tile.
Co-authored-by: Andrew Kvochko <a.kvochko@lightricks.com>
2025-01-01 16:29:01 -05:00
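A small sketch of the tiling behavior described in the commit body; the function name and parameters are illustrative, not the actual decoder code:

```python
def temporal_tile_starts(num_frames: int, tile_size: int, overlap: int) -> list[int]:
    """Start indices of overlapping temporal tiles covering [0, num_frames).

    The last tile is clamped so it ends at num_frames, and a trailing tile whose
    frames are already fully covered by its predecessor is dropped.
    """
    stride = tile_size - overlap
    starts: list[int] = []
    t = 0
    while t < num_frames:
        start = max(min(t, num_frames - tile_size), 0)
        if starts and start <= starts[-1]:
            break  # this tile would sit entirely inside the previous one
        starts.append(start)
        t += stride
    return starts

print(temporal_tile_starts(10, 6, 2))  # [0, 4]: two overlapping tiles cover all 10 frames
```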
patientx
0dbf8238af
Merge branch 'comfyanonymous:master' into master
2025-01-01 15:55:00 +03:00
comfyanonymous
79eea51a1d
Fix and enforce all ruff W rules.
2025-01-01 03:08:33 -05:00
patientx
5c8d73f4b4
Merge branch 'comfyanonymous:master' into master
2025-01-01 02:14:54 +03:00