Commit Graph

185 Commits

Author SHA1 Message Date
doctorpangloss
b241ecc56d Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-21 11:38:24 -07:00
comfyanonymous
83d969e397 Disable xformers when tracing model. 2024-05-21 13:55:49 -04:00
doctorpangloss
f69b6225c0 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-20 12:06:35 -07:00
comfyanonymous
1900e5119f Fix potential issue. 2024-05-20 08:19:54 -04:00
comfyanonymous
0bdc2b15c7 Cleanup. 2024-05-18 10:11:44 -04:00
comfyanonymous
98f828fad9 Remove unnecessary code. 2024-05-18 09:36:44 -04:00
doctorpangloss
3d98440fb7 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-16 14:28:49 -07:00
comfyanonymous
46daf0a9a7 Add debug options to force on and off attention upcasting. 2024-05-16 04:09:41 -04:00
comfyanonymous
ec6f16adb6 Fix SAG. 2024-05-14 18:02:27 -04:00
comfyanonymous
bb4940d837 Only enable attention upcasting on models that actually need it. 2024-05-14 17:00:50 -04:00
comfyanonymous
b0ab31d06c Refactor attention upcasting code part 1. 2024-05-14 12:47:31 -04:00
doctorpangloss
330ecb10b2 Merge with upstream. Remove TLS flags, because a third-party proxy will do this better. 2024-05-02 21:57:20 -07:00
comfyanonymous
2aed53c4ac Workaround xformers bug. 2024-04-30 21:23:40 -04:00
doctorpangloss
005e370254 Merge upstream 2024-03-21 13:15:36 -07:00
comfyanonymous
d7897fff2c Move cascade scale factor from stage_a to latent_formats.py 2024-03-16 14:49:35 -04:00
doctorpangloss
93cdef65a4 Merge upstream 2024-03-12 09:49:47 -07:00
comfyanonymous
2a813c3b09 Switch some more prints to logging. 2024-03-11 16:34:58 -04:00
doctorpangloss
c0d9bc0129 Merge with upstream 2024-03-08 15:17:20 -08:00
comfyanonymous
5f60ee246e Support loading the sr cascade controlnet. 2024-03-07 01:22:48 -05:00
comfyanonymous
03e6e81629 Set upscale algorithm to bilinear for stable cascade controlnet. 2024-03-06 02:59:40 -05:00
comfyanonymous
03e83bb5d0 Support stable cascade canny controlnet. 2024-03-06 02:25:42 -05:00
doctorpangloss
915f2da874 Merge upstream 2024-02-29 20:48:27 -08:00
comfyanonymous
cb7c3a2921 Allow image_only_indicator to be None. 2024-02-29 13:11:30 -05:00
comfyanonymous
b3e97fc714 Koala 700M and 1B support. Use the UNET Loader node to load the unet file to use them. 2024-02-28 12:10:11 -05:00
doctorpangloss
5cd06a727f Fix relative imports 2024-02-21 13:31:55 -08:00
doctorpangloss
f419c5760d Adding __init__.py 2024-02-20 14:24:51 -08:00
doctorpangloss
7520691021 Merge with master 2024-02-19 10:55:22 -08:00
comfyanonymous
e93cdd0ad0 Remove print. 2024-02-19 11:47:26 -05:00
comfyanonymous
a7b5eaa7e3 Forgot to commit this. 2024-02-19 04:25:46 -05:00
comfyanonymous
6bcf57ff10 Fix attention masks properly for multiple batches. 2024-02-17 16:15:18 -05:00
comfyanonymous
11e3221f1f fp8 weight support for Stable Cascade. 2024-02-17 15:27:31 -05:00
comfyanonymous
f8706546f3 Fix attention mask batch size in some attention functions. 2024-02-17 15:22:21 -05:00
comfyanonymous
3b9969c1c5 Properly fix attention masks in CLIP with batches. 2024-02-17 12:13:13 -05:00
comfyanonymous
805c36ac9c Make Stable Cascade work on old pytorch 2.0 2024-02-17 00:42:30 -05:00
comfyanonymous
667c92814e Stable Cascade Stage B. 2024-02-16 13:02:03 -05:00
comfyanonymous
f83109f09b Stable Cascade Stage C. 2024-02-16 10:55:08 -05:00
comfyanonymous
5e06baf112 Stable Cascade Stage A. 2024-02-16 06:30:39 -05:00
doctorpangloss
54d419d855 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-02-08 20:31:05 -08:00
Benjamin Berman
b8fc850b47 Correctly preserves your installed torch when installed via pip install --no-build-isolation git+https://github.com/hiddenswitch/ComfyUI.git 2024-02-08 08:36:05 -08:00
comfyanonymous
c661a8b118 Don't use numpy for calculating sigmas. 2024-02-07 18:52:51 -05:00
doctorpangloss
82edb2ff0e Merge with latest upstream. 2024-01-29 15:06:31 -08:00
comfyanonymous
89507f8adf Remove some unused imports. 2024-01-25 23:42:37 -05:00
comfyanonymous
2395ae740a Make unclip more deterministic. Pass a seed argument; note that this might make old unclip images different. 2024-01-14 17:28:31 -05:00
comfyanonymous
6a7bc35db8 Use basic attention implementation for small inputs on old pytorch. 2024-01-09 13:46:52 -05:00
comfyanonymous
c6951548cf Update optimized_attention_for_device function for new functions that support masked attention. 2024-01-07 13:52:08 -05:00
comfyanonymous
aaa9017302 Add attention mask support to sub quad attention. 2024-01-07 04:13:58 -05:00
comfyanonymous
0c2c9fbdfa Support attention mask in split attention. 2024-01-06 13:16:48 -05:00
comfyanonymous
3ad0191bfb Implement attention mask on xformers. 2024-01-06 04:33:03 -05:00
doctorpangloss
369aeb598f Merge upstream, fix 3.12 compatibility, fix nightlies issue, fix broken node 2024-01-03 16:00:36 -08:00
comfyanonymous
8c6493578b Implement noise augmentation for SD 4X upscale model. 2024-01-03 14:27:11 -05:00