Commit Graph

1203 Commits

Author SHA1 Message Date
comfyanonymous
2c22729cec Start is now 0.0 and end is now 1.0 for the timestep ranges. 2023-07-24 18:38:17 -04:00
comfyanonymous
857a80b857 ControlNetApplyAdvanced can now define when controlnet gets applied. 2023-07-24 17:50:49 -04:00
comfyanonymous
0ed4637a7a Add a ControlNetApplyAdvanced node.
The controlnet can be applied to the positive or negative prompt only by
connecting it correctly.
2023-07-24 13:35:20 -04:00
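
Note: the three commits above introduce and refine the advanced ControlNet node. A minimal sketch of what its input definition looks like after the 0.0/1.0 timestep-range change; field names and defaults are approximations for illustration, not copied from the repository:

```python
class ControlNetApplyAdvanced:
    """Sketch of the advanced ControlNet node: separate positive/negative
    conditioning inputs plus a start/end window controlling when the
    controlnet is applied during sampling."""
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "positive": ("CONDITIONING",),
            "negative": ("CONDITIONING",),
            "control_net": ("CONTROL_NET",),
            "image": ("IMAGE",),
            "strength": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 10.0, "step": 0.01}),
            # Per commit 2c22729cec: 0.0 = start of sampling, 1.0 = end of sampling.
            "start_percent": ("FLOAT", {"default": 0.0, "min": 0.0, "max": 1.0, "step": 0.001}),
            "end_percent": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 1.0, "step": 0.001}),
        }}

    RETURN_TYPES = ("CONDITIONING", "CONDITIONING")
    FUNCTION = "apply_controlnet"
    CATEGORY = "conditioning"
```

Connecting only the positive (or only the negative) conditioning through the node applies the controlnet to that prompt alone, as noted in commit 0ed4637a7a.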
comfyanonymous
156b047427 Add a way to set which range of timesteps the cond gets applied to. 2023-07-24 09:25:02 -04:00
comfyanonymous
34c90cf9ca Disable cuda malloc on all the 9xx series. 2023-07-23 13:29:14 -04:00
comfyanonymous
aa8fde7d6b Try to fix memory issue with lora. 2023-07-22 21:38:56 -04:00
comfyanonymous
4d83ca88a5 Nodes can now patch the unet function. 2023-07-22 17:01:12 -04:00
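
A hedged sketch of how a custom node might use this unet-patching hook; the method name and argument layout are assumptions based on ComfyUI's ModelPatcher and may differ from what this commit actually added:

```python
def identity_unet_wrapper(apply_model, args):
    # Pass-through wrapper: a custom node could inspect or modify the latent,
    # timestep, or conditioning here before calling the real unet function.
    return apply_model(args["input"], args["timestep"], **args["c"])

def patch(model):
    m = model.clone()  # work on a copy instead of mutating the loaded model
    m.set_model_unet_function_wrapper(identity_unet_wrapper)
    return (m,)
```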
comfyanonymous
676b635914 Del the right object when applying lora. 2023-07-22 11:25:49 -04:00
comfyanonymous
320305b17b Disable cuda malloc on regular GTX 960. 2023-07-22 11:05:33 -04:00
comfyanonymous
bfb95012c7 Support controlnet in diffusers format. 2023-07-21 22:58:16 -04:00
comfyanonymous
c8a5d3363c Fix issue with lora in some cases when combined with model merging. 2023-07-21 21:27:27 -04:00
comfyanonymous
be012f4b28 Properly support SDXL diffusers unet with UNETLoader node. 2023-07-21 14:38:56 -04:00
comfyanonymous
e2c3256a19 Print errors and continue when lora weights are not compatible. 2023-07-20 19:56:22 -04:00
comfyanonymous
b2879e0168 Merge branch 'fix-AttributeError-module-'torch'-has-no-attribute-'mps'' of https://github.com/KarryCharon/ComfyUI 2023-07-20 00:34:54 -04:00
comfyanonymous
dd46c9e941 Move image encoding outside of sampling loop for better preview perf. 2023-07-19 18:06:58 -04:00
comfyanonymous
f5b46e87d5 Disable cuda malloc on GTX 750 Ti. 2023-07-19 15:14:10 -04:00
comfyanonymous
996acc190d Update how to get the prompt in api format in the example. 2023-07-19 15:07:12 -04:00
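
The referenced example ultimately posts the API-format workflow JSON to the running server. A minimal sketch, assuming the default 127.0.0.1:8188 address and the /prompt endpoint; `prompt_workflow` stands in for a workflow exported in API format:

```python
import json
import urllib.request

def queue_prompt(prompt_workflow, server="127.0.0.1:8188"):
    # The server expects a JSON body of the form {"prompt": <api-format workflow dict>}.
    data = json.dumps({"prompt": prompt_workflow}).encode("utf-8")
    req = urllib.request.Request(f"http://{server}/prompt", data=data)
    return urllib.request.urlopen(req).read()
```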
comfyanonymous
d1b7051401 Auto disable cuda malloc on some GPUs on windows. 2023-07-19 14:43:55 -04:00
comfyanonymous
c17cfd126d Fix typo. 2023-07-19 10:20:32 -04:00
comfyanonymous
d71acc2fe4 Fix ddim issue with older torch versions. 2023-07-19 10:16:00 -04:00
comfyanonymous
3aad28d483 Add MX450 and MX550 to list of cards with broken fp16. 2023-07-19 03:08:30 -04:00
comfyanonymous
4ec7f09adc It's actually possible to torch.compile the unet now. 2023-07-18 21:36:35 -04:00
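
For context, compiling the unet relies on the standard PyTorch 2.0+ API; a rough sketch, where treating the diffusion model as a plain `torch.nn.Module` is an assumption about how a custom node would obtain it:

```python
import torch

def compile_diffusion_model(unet: torch.nn.Module) -> torch.nn.Module:
    # torch.compile (PyTorch 2.0+) returns a wrapped module that traces and
    # optimizes the forward pass the first time it is called.
    return torch.compile(unet)
```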
comfyanonymous
cea5c2adfb Add key to indicate checkpoint is v_prediction when saving. 2023-07-18 00:25:53 -04:00
comfyanonymous
22abe3af9f Fix device print on old torch version. 2023-07-17 15:18:58 -04:00
comfyanonymous
5108099f0c Enable --cuda-malloc by default on torch 2.0 and up.
Add --disable-cuda-malloc to disable it.
2023-07-17 15:12:10 -04:00
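
The flag toggles PyTorch's asynchronous CUDA allocator. A rough sketch of the mechanism, assuming it is driven through the PYTORCH_CUDA_ALLOC_CONF environment variable, which must be set before torch is imported; exact details may differ from the commit:

```python
import os
import sys

# backend:cudaMallocAsync switches PyTorch's caching allocator to
# cudaMallocAsync; it has to be configured before `import torch`.
if "--disable-cuda-malloc" not in sys.argv:
    env = os.environ.get("PYTORCH_CUDA_ALLOC_CONF", "")
    os.environ["PYTORCH_CUDA_ALLOC_CONF"] = (env + "," if env else "") + "backend:cudaMallocAsync"

import torch
```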
comfyanonymous
7da7500fcc --windows-standalone-build now enables --cuda-malloc 2023-07-17 14:10:36 -04:00
comfyanonymous
5ddb2ca26f Add a command line argument to enable backend:cudaMallocAsync 2023-07-17 11:00:14 -04:00
comfyanonymous
f9688727f1 Only calculate randn in some samplers when it's actually being used. 2023-07-17 10:11:08 -04:00
comfyanonymous
244987af7e Fix regression with ddim and uni_pc when batch size > 1. 2023-07-17 09:35:19 -04:00
comfyanonymous
debccdc6f9 Refactor of sampler code to deal more easily with different model types. 2023-07-17 01:22:12 -04:00
comfyanonymous
8a2021b23c Merge branch 'master' of https://github.com/ComfyUI-Community/ComfyUI 2023-07-16 03:04:45 -04:00
comfyanonymous
ba6e888eb9 Lower lora ram usage when in normal vram mode. 2023-07-16 02:59:04 -04:00
ComfyUI-Community
d9280c8353 Patch del self.loaded_lora to prevent error with persistent lora_name swapping 2023-07-15 17:11:12 -07:00
comfyanonymous
73c2afbe44 Speed up lora loading a bit. 2023-07-15 13:25:22 -04:00
comfyanonymous
f67f1c99b8 Fix CLIPSetLastLayer not reverting when removed. 2023-07-15 01:41:21 -04:00
comfyanonymous
26da73b600 Reduce floating point rounding errors in loras. 2023-07-15 00:53:00 -04:00
comfyanonymous
daac253452 Add a node to merge CLIP models. 2023-07-14 02:41:18 -04:00
comfyanonymous
ea8feef004 Refactor to make it easier to set the api path. 2023-07-14 00:50:49 -04:00
comfyanonymous
a4a16c6bbd Merge branch 'use-relative-paths' of https://github.com/mcmonkey4eva/ComfyUI 2023-07-13 23:56:29 -04:00
comfyanonymous
8ee8947887 Move conditioning concat node to conditioning section. 2023-07-13 21:44:56 -04:00
comfyanonymous
dbc9ff3ffa Enables a way to save workflows in api format in frontend.
Enable the dev mode in the settings to see it.
2023-07-13 21:08:54 -04:00
comfyanonymous
a85b525a8a Add a canny preprocessor node. 2023-07-13 13:26:48 -04:00
comfyanonymous
8fbbd78d3b Print prestartup times for custom nodes. 2023-07-13 13:01:45 -04:00
comfyanonymous
4c83bb4110 Don't let custom nodes overwrite base nodes. 2023-07-13 12:56:38 -04:00
comfyanonymous
726fe07028 Highlight nodes with errors in red even when workflow works fine. 2023-07-13 10:07:50 -04:00
comfyanonymous
f4c18db4ed Prevent the clip_g position_ids key from being saved in the checkpoint.
This is to make it match the official checkpoint.
2023-07-12 20:15:02 -04:00
comfyanonymous
28bf6d49da Fix potential tensors being on different devices issues. 2023-07-12 19:29:27 -04:00
comfyanonymous
2eb19cbe2e Add back roundRect to fix issue on firefox ESR. 2023-07-12 02:07:48 -04:00
KarryCharon
3ee78c064b fix mps miss import 2023-07-12 10:06:34 +08:00
comfyanonymous
8784b492ed Add a random string to the temp prefix for PreviewImage. 2023-07-11 17:35:55 -04:00