Benjamin Berman
b6d3f1fb08
Accept workflows from the command line
2025-05-07 14:53:39 -07:00
doctorpangloss
5823497d55
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2025-04-21 13:14:36 -07:00
Raphael Walker
89e4ea0175
Add activations_shape info in UNet models (#7482)
...
* Add activations_shape info in UNet models
* activations_shape should be a list
2025-04-04 21:27:54 -04:00
doctorpangloss
ffc1912eff
Fix issues with tests
2025-04-04 08:27:33 -07:00
doctorpangloss
040a324346
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2025-03-29 15:57:24 -07:00
comfyanonymous
e471c726e5
Fallback to pytorch attention if sage attention fails.
2025-03-22 15:45:56 -04:00
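A minimal sketch of this kind of runtime fallback, assuming the sageattention package exposes sageattn as its entry point; the wrapper name, tensor layout, and exception handling below are illustrative, not ComfyUI's actual code:

```python
# Sketch only: try the SageAttention kernel first; on any failure, drop back to
# PyTorch scaled_dot_product_attention.
import torch.nn.functional as F

def attention_with_fallback(q, k, v, mask=None):
    # q, k, v assumed to be (batch, heads, seq_len, head_dim)
    try:
        from sageattention import sageattn
        return sageattn(q, k, v, tensor_layout="HND", is_causal=False)
    except Exception:
        # Missing package, unsupported shape, or a kernel error all land here.
        return F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
```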
FeepingCreature
9c98c6358b
Tolerate missing @torch.library.custom_op (#7234)
...
This can happen on Pytorch versions older than 2.4.
2025-03-14 09:51:26 -04:00
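torch.library.custom_op only exists on newer PyTorch (2.4 and later), so a guarded definition is one way to tolerate its absence. A minimal sketch, assuming a no-op decorator is an acceptable stand-in on old versions; this is not the PR's exact code:

```python
# Sketch only: use the real torch.library.custom_op when present, otherwise fall
# back to a do-nothing decorator so older PyTorch (< 2.4) still imports cleanly.
import torch

if hasattr(torch.library, "custom_op"):
    custom_op = torch.library.custom_op
else:
    def custom_op(name, mutates_args=(), device_types=None):
        def decorator(fn):
            return fn  # leave the decorated function unchanged
        return decorator
```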
FeepingCreature
7aceb9f91c
Add --use-flash-attention flag. (#7223)
...
* Add --use-flash-attention flag.
This is useful on AMD systems, as FA builds are still 10% faster than Pytorch cross-attention.
2025-03-14 03:22:41 -04:00
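A minimal sketch of the backend such a flag selects, assuming the flash-attn package and its flash_attn_func API; the wrapper is illustrative rather than the flag's actual wiring:

```python
# Sketch only: the Flash Attention path behind a --use-flash-attention style flag.
# flash_attn_func expects (batch, seq_len, heads, head_dim) fp16/bf16 tensors.
from flash_attn import flash_attn_func

def attention_flash(q, k, v, causal=False):
    return flash_attn_func(q, k, v, causal=causal)
```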
doctorpangloss
693038738a
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2025-02-24 09:39:26 -08:00
comfyanonymous
96d891cb94
Speedup on some models by not upcasting bfloat16 to float32 on mac.
2025-02-24 05:41:32 -05:00
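A hedged illustration of the idea: on Apple's MPS backend, keep bfloat16 tensors as they are instead of paying for a float32 round trip; the helper name and the non-Mac behavior shown are assumptions:

```python
# Sketch only: skip the bfloat16 -> float32 upcast when running on MPS (Mac).
import torch

def maybe_upcast(t: torch.Tensor) -> torch.Tensor:
    if t.dtype == torch.bfloat16 and t.device.type != "mps":
        return t.float()  # assumed: other backends may still upcast
    return t  # on Mac, compute directly in bfloat16
```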
comfyanonymous
aff16532d4
Remove some useless code.
2025-02-22 04:45:14 -05:00
Dr.Lt.Data
0a0df5f136
better guide message for sageattention (#6634)
2025-02-02 09:26:47 -05:00
doctorpangloss
631d9e44c6
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2025-01-16 09:58:02 -08:00
comfyanonymous
129d8908f7
Add argument to skip the output reshaping in the attention functions.
2025-01-10 06:27:37 -05:00
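A sketch of what such an argument can look like on a PyTorch SDPA backend; the function and parameter names are illustrative, not the repository's exact signature:

```python
# Sketch only: when skip_output_reshape is set, return the raw
# (batch, heads, seq_len, head_dim) result instead of flattening it back to
# (batch, seq_len, heads * head_dim).
import torch.nn.functional as F

def attention_pytorch(q, k, v, heads, mask=None, skip_output_reshape=False):
    b, _, seq_len, head_dim = q.shape
    out = F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
    if skip_output_reshape:
        return out
    return out.transpose(1, 2).reshape(b, seq_len, heads * head_dim)
```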
doctorpangloss
7655be873c
Updates to support Hunyuan Video
2024-12-25 22:39:12 -08:00
doctorpangloss
0fd407ae87
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-12-24 16:48:03 -08:00
comfyanonymous
37e5390f5f
Add: --use-sage-attention to enable SageAttention.
...
You need to have the library installed first.
2024-12-18 01:56:10 -05:00
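Because the kernel lives in a separate package, a guarded import that fails loudly is one way to honor "installed first". A sketch under that assumption; the error message is illustrative:

```python
# Sketch only: --use-sage-attention needs the sageattention package; surface a
# clear error if the flag is requested but the import fails.
try:
    from sageattention import sageattn
except ImportError as exc:
    raise RuntimeError(
        "--use-sage-attention was requested but the sageattention package "
        "is not installed (pip install sageattention)"
    ) from exc
```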
comfyanonymous
19ee5d9d8b
Don't expand mask when not necessary.
...
Expanding seems to slow down inference.
2024-12-16 18:22:50 -05:00
Raphael Walker
61b50720d0
Add support for attention masking in Flux (#5942)
...
* fix attention OOM in xformers
* allow passing attention mask in flux attention
* allow an attn_mask in flux
* attn masks can be done using replace patches instead of a separate dict
* fix return types
* fix return order
* enumerate
* patch the right keys
* arg names
* fix a silly bug
* fix xformers masks
* replace match with if, elif, else
* mask with image_ref_size
* remove unused import
* remove unused import 2
* fix pytorch/xformers attention
This corrects a weird inconsistency with skip_reshape.
It also allows masks of various shapes to be passed, which will be
automatically expanded (in a memory-efficient way) to a size that is
compatible with xformers or pytorch sdpa respectively (a sketch of this expansion follows below).
* fix mask shapes
2024-12-16 18:21:17 -05:00
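A minimal sketch of the broadcast-based expansion described in the "fix pytorch/xformers attention" bullet above, assuming masks arrive as (q_len, k_len) or (batch, q_len, k_len); expand() returns a view, so the widened mask costs no extra memory:

```python
# Sketch only: widen a mask to the (batch, heads, q_len, k_len) shape that
# PyTorch SDPA and xformers expect, without copying memory.
def expand_mask(mask, batch, heads, q_len, k_len):
    if mask.ndim == 2:        # (q_len, k_len), shared across batch and heads
        mask = mask[None, None]
    elif mask.ndim == 3:      # (batch, q_len, k_len), shared across heads
        mask = mask[:, None]
    return mask.expand(batch, heads, q_len, k_len)
```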
Chenlei Hu
d9d7f3c619
Lint all unused variables (#5989)
...
* Enable F841
* Autofix
* Remove all unused variable assignment
2024-12-12 17:59:16 -05:00
doctorpangloss
f39b8dfebc
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-11-22 15:50:23 -08:00
comfyanonymous
2fd9c1308a
Fix mask issue in some attention functions.
2024-11-22 02:10:09 -05:00
comfyanonymous
07f6eeaa13
Fix mask issue with attention_xformers.
2024-11-20 17:07:46 -05:00
doctorpangloss
772e768fe8
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-11-04 10:17:26 -08:00
comfyanonymous
fabf449feb
Mochi VAE encoder.
2024-11-01 17:33:09 -04:00
doctorpangloss
a8d8bff548
Improve support for torch compilation and sage attention
2024-10-29 19:22:26 -07:00
doctorpangloss
99f0fa8b50
Enable sage attention autodetection
2024-10-09 09:27:05 -07:00
doctorpangloss
388dad67d5
Fix pylint errors in attention
2024-10-09 09:26:02 -07:00
doctorpangloss
bbe2ed330c
Memory management and compilation improvements
...
- Experimental support for sage attention on Linux
- Diffusers loader now supports model indices
- Transformers model management now aligns with updates to ComfyUI
- Flux layers correctly use unbind
- Add float8 support for model loading in more places
- Experimental quantization approaches from Quanto and torchao
- Model upscaling interacts with memory management better
This update also disables ROCm testing because it isn't reliable enough
on consumer hardware. ROCm is not really supported by the 7600.
2024-10-09 09:13:47 -07:00
doctorpangloss
8284ea2fca
WIP merge
2024-08-16 14:25:06 -07:00
comfyanonymous
33fb282d5c
Fix issue.
2024-08-14 02:51:47 -04:00
doctorpangloss
0549f35e85
Merge commit '39fb74c5bd13a1dccf4d7293a2f7a755d9f43cbd' of github.com:comfyanonymous/ComfyUI
...
- Improvements to tests
- Fixes model management
- Fixes issues with language nodes
2024-08-13 20:08:56 -07:00
doctorpangloss
8cdc246450
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-06-17 16:19:48 -07:00
comfyanonymous
bb1969cab7
Initial support for the stable audio open model.
2024-06-15 12:14:56 -04:00
Max Tretikov
8b091f02de
Add xformers.ops imports
2024-06-14 14:09:46 -06:00
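A sketch of the guarded import pattern such fixes usually converge on: xformers.ops typically has to be imported explicitly, since importing the top-level xformers package alone does not necessarily make memory_efficient_attention reachable. The availability flag name is illustrative:

```python
# Sketch only: import xformers.ops explicitly and record whether the backend works.
try:
    import xformers
    import xformers.ops  # provides xformers.ops.memory_efficient_attention
    XFORMERS_AVAILABLE = True
except ImportError:
    XFORMERS_AVAILABLE = False
```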
Max Tretikov
6c53388619
Fix xformers import statements in comfy.ldm.modules.attention
2024-06-14 11:21:08 -06:00
doctorpangloss
cb557c960b
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-05-31 07:42:11 -07:00
comfyanonymous
0920e0e5fe
Remove some unused imports.
2024-05-27 19:08:27 -04:00
comfyanonymous
8508df2569
Work around black image bug on Mac 14.5 by forcing attention upcasting.
2024-05-21 16:56:33 -04:00
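For context, "attention upcasting" here means running the attention math in float32 even when the model otherwise runs in fp16/bf16. A minimal sketch of that idea, not the exact workaround code:

```python
# Sketch only: compute softmax(QK^T)V in float32, then cast back to the model dtype.
import torch.nn.functional as F

def attention_upcast(q, k, v):
    dtype = q.dtype
    out = F.scaled_dot_product_attention(q.float(), k.float(), v.float())
    return out.to(dtype)
```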
doctorpangloss
b241ecc56d
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-05-21 11:38:24 -07:00
comfyanonymous
83d969e397
Disable xformers when tracing model.
2024-05-21 13:55:49 -04:00
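A sketch of the check, assuming "tracing" refers to torch.jit tracing/scripting; the helper name is hypothetical:

```python
# Sketch only: avoid the xformers path while the model is being traced, since
# traced or scripted graphs and the xformers kernels do not compose.
import torch

def xformers_enabled(xformers_available: bool) -> bool:
    if torch.jit.is_tracing() or torch.jit.is_scripting():
        return False
    return xformers_available
```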
doctorpangloss
f69b6225c0
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-05-20 12:06:35 -07:00
comfyanonymous
1900e5119f
Fix potential issue.
2024-05-20 08:19:54 -04:00
comfyanonymous
0bdc2b15c7
Cleanup.
2024-05-18 10:11:44 -04:00
comfyanonymous
98f828fad9
Remove unnecessary code.
2024-05-18 09:36:44 -04:00
doctorpangloss
3d98440fb7
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-05-16 14:28:49 -07:00
comfyanonymous
46daf0a9a7
Add debug options to force on and off attention upcasting.
2024-05-16 04:09:41 -04:00
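A tiny sketch of how force-on/force-off debug switches can resolve against a per-model preference (see also the "only enable attention upcasting on models that actually need it" commit below); the names and precedence here are assumptions:

```python
# Sketch only: debug overrides win over the model's own upcasting requirement.
def should_upcast_attention(model_needs_upcast: bool,
                            force_upcast: bool = False,
                            disable_upcast: bool = False) -> bool:
    if force_upcast:
        return True
    if disable_upcast:
        return False
    return model_needs_upcast
```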
comfyanonymous
ec6f16adb6
Fix SAG.
2024-05-14 18:02:27 -04:00
comfyanonymous
bb4940d837
Only enable attention upcasting on models that actually need it.
2024-05-14 17:00:50 -04:00
comfyanonymous
b0ab31d06c
Refactor attention upcasting code part 1.
2024-05-14 12:47:31 -04:00