Commit Graph

411 Commits

Author SHA1 Message Date
Simon Lui
af8959c8a9 Add ipex optimize and other enhancements for Intel GPUs based on recent memory changes. 2023-08-20 14:19:51 -04:00
comfyanonymous
56901bd7c6 --disable-smart-memory now disables loading model directly to vram. 2023-08-20 04:00:53 -04:00
comfyanonymous
225a5f9f1f Free more memory before VAE encode/decode. 2023-08-19 12:13:13 -04:00
comfyanonymous
01a6f9b116 Fix issue with gligen. 2023-08-18 16:32:23 -04:00
comfyanonymous
280659a6ee Support for Control Loras.
Control loras are controlnets where some of the weights are stored in
"lora" format: an up and a down low rank matrix that, when multiplied
together and added to the unet weight, give the controlnet weight.

This allows a much smaller memory footprint depending on the rank of the
matrices.

These controlnets are used just like regular ones; a rough sketch of the weight reconstruction follows this entry.
2023-08-18 11:59:51 -04:00
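The reconstruction described in that commit can be sketched in a few lines. This is a rough illustration under assumed names and shapes (base_weight, lora_up, lora_down); it is not ComfyUI's actual control-lora loading code.

```python
import torch

def reconstruct_controlnet_weight(base_weight: torch.Tensor,
                                   lora_up: torch.Tensor,
                                   lora_down: torch.Tensor) -> torch.Tensor:
    # base_weight: (out_dim, in_dim) weight taken from the unet.
    # lora_up:     (out_dim, rank)  "up" matrix stored in the control lora file.
    # lora_down:   (rank, in_dim)   "down" matrix stored in the control lora file.
    delta = lora_up @ lora_down  # low rank product, shape (out_dim, in_dim)
    return base_weight + delta.to(base_weight.dtype)
```

Storing rank*(out_dim + in_dim) values per layer instead of out_dim*in_dim is what keeps the file small when the rank is low.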
comfyanonymous
398390a76f ReVision support: unclip nodes can now be used with SDXL. 2023-08-18 11:59:36 -04:00
comfyanonymous
e246c23708 Add support for clip g vision model to CLIPVisionLoader. 2023-08-18 11:13:29 -04:00
Alexopus
a5a8d25943 Fix "referenced before assignment" error
For https://github.com/BlenderNeko/ComfyUI_TiledKSampler/issues/13
2023-08-17 22:30:07 +02:00
comfyanonymous
bcf55c1446 Fix issue with not freeing enough memory when sampling. 2023-08-17 15:59:56 -04:00
comfyanonymous
d8f9334347 Fix bug with lowvram and controlnet advanced node. 2023-08-17 13:38:51 -04:00
comfyanonymous
e075077ad8 Fix potential issues with patching models when saving checkpoints. 2023-08-17 11:07:08 -04:00
comfyanonymous
21e07337ed Add --disable-smart-memory for those that want the old behaviour. 2023-08-17 03:12:37 -04:00
comfyanonymous
197ab43811 Fix issue with regular torch version. 2023-08-17 01:58:54 -04:00
comfyanonymous
a216b56591 Smarter memory management.
Try to keep models in vram when possible (a toy sketch of the idea follows this entry).

Better lowvram mode for controlnets.
2023-08-17 01:06:34 -04:00
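A minimal sketch of the "keep models in vram until the space is actually needed" policy, assuming a CUDA device and made-up names (loaded_models, free_vram, load_model); this is not ComfyUI's model_management code.

```python
import torch

loaded_models = []  # models currently kept in vram, oldest first (hypothetical cache)

def free_vram(bytes_needed: int, device: torch.device):
    # Evict the oldest cached models only when a new load would not fit.
    while loaded_models and torch.cuda.mem_get_info(device)[0] < bytes_needed:
        loaded_models.pop(0).to("cpu")
        torch.cuda.empty_cache()

def load_model(model, bytes_needed: int, device: torch.device):
    free_vram(bytes_needed, device)
    model.to(device)
    loaded_models.append(model)
    return model
```

The point is that nothing is offloaded eagerly after use, so a model that is sampled again can be reused without another transfer to vram.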
comfyanonymous
e4ffcf2c61 Support small diffusers controlnet so both types are now supported. 2023-08-16 12:45:56 -04:00
comfyanonymous
1a21a2271e Support diffusers mini controlnets. 2023-08-16 12:28:01 -04:00
comfyanonymous
9e0c084148 Fix clip vision issue with old transformers versions. 2023-08-16 11:36:22 -04:00
comfyanonymous
dcbf839d22 Fix potential issue with batch size and clip vision. 2023-08-16 11:05:11 -04:00
comfyanonymous
601e4a9865 Refactor unclip code. 2023-08-14 23:48:47 -04:00
comfyanonymous
736e2e8f49 CLIPVisionEncode can now encode multiple images. 2023-08-14 16:54:05 -04:00
comfyanonymous
87f1037cf5 Remove 3m from PR #1213 because of some small issues. 2023-08-14 00:48:45 -04:00
comfyanonymous
c3910c4ffb Add sgm_uniform scheduler that acts like the default one in sgm. 2023-08-14 00:29:03 -04:00
comfyanonymous
c5666c503b Gpu variant of dpmpp_3m_sde. Note: use 3m with exponential or karras. 2023-08-14 00:28:50 -04:00
comfyanonymous
fdff76f667 Merge branch 'dpmpp3m' of https://github.com/FizzleDorf/ComfyUI 2023-08-14 00:23:15 -04:00
FizzleDorf
8b4773ee86 dpmpp 3m + dpmpp 3m sde added 2023-08-13 22:29:04 -04:00
comfyanonymous
c3df7d2861 Print unet config when model isn't detected. 2023-08-13 01:39:48 -04:00
comfyanonymous
a2d5028ad8 Support for yet another lora type based on diffusers. 2023-08-11 13:04:21 -04:00
comfyanonymous
64510bb4c2 Add --temp-directory argument to set temp directory. 2023-08-11 05:13:03 -04:00
comfyanonymous
c5a20b5c85 Support diffusers text encoder loras. 2023-08-10 20:28:28 -04:00
comfyanonymous
1b69a7e7ea Disable calculating uncond when CFG is 1.0 2023-08-09 20:55:03 -04:00
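The optimization above follows directly from the classifier-free guidance formula: the combined prediction is uncond + cond_scale * (cond - uncond), which reduces to cond when cond_scale is 1.0, so the unconditional forward pass can be skipped. A hedged sketch with illustrative names (not ComfyUI's actual sampling code):

```python
import torch

def cfg_denoise(model, x, sigma, cond, uncond, cond_scale: float):
    pred_cond = model(x, sigma, cond)
    if cond_scale == 1.0:
        # uncond + 1.0 * (cond - uncond) == cond, so the second
        # forward pass contributes nothing and is skipped.
        return pred_cond
    pred_uncond = model(x, sigma, uncond)
    return pred_uncond + cond_scale * (pred_cond - pred_uncond)
```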
comfyanonymous
84c0dd8247 Add argument to disable auto launching the browser. 2023-08-07 02:25:12 -04:00
comfyanonymous
531ea9c1aa Detect hint_channels from controlnet. 2023-08-06 14:08:59 -04:00
comfyanonymous
18c180b727 Support loras in diffusers format. 2023-08-05 01:40:24 -04:00
comfyanonymous
ef16077917 Add CMP 30HX card to the nvidia_16_series list. 2023-08-04 12:08:45 -04:00
comfyanonymous
bbd5052ed0 Make sure the pooled output stays at the EOS token with added embeddings. 2023-08-03 20:27:50 -04:00
comfyanonymous
28401d83c5 Only shift text encoder to vram when CPU cores are under 8. 2023-07-31 00:08:54 -04:00
comfyanonymous
2ee42215be Lower CPU thread check for running the text encoder on the CPU vs GPU. 2023-07-30 17:18:24 -04:00
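The two entries above describe a device-selection heuristic for the text encoder. A toy sketch of the idea, with an assumed threshold and function name (not the actual ComfyUI logic):

```python
import os
import torch

def text_encoder_device(core_threshold: int = 8) -> torch.device:
    # Illustrative heuristic: with few CPU cores the text encoder is slow
    # on the CPU, so prefer the GPU; otherwise keep it on the CPU and
    # leave vram free for the unet.
    cores = os.cpu_count() or 1
    if torch.cuda.is_available() and cores < core_threshold:
        return torch.device("cuda")
    return torch.device("cpu")
```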
comfyanonymous
9c5ad64310 Remove some useless code. 2023-07-30 14:13:33 -04:00
comfyanonymous
71ffc7b350 Faster VAE loading. 2023-07-29 16:28:30 -04:00
comfyanonymous
8a8d8c86d6 Initialize the unet directly on the target device. 2023-07-29 14:51:56 -04:00
comfyanonymous
6651febc61 Remove unused code and torchdiffeq dependency. 2023-07-28 21:32:27 -04:00
comfyanonymous
8088be4f2e Add --disable-metadata argument to disable saving metadata in files. 2023-07-28 12:31:41 -04:00
comfyanonymous
7ffe966c0a Merge branch 'fix_batch_timesteps' of https://github.com/asagi4/ComfyUI 2023-07-27 16:13:48 -04:00
comfyanonymous
cb47a5674c Remove some prints. 2023-07-27 16:12:43 -04:00
asagi4
a63268b4e9 Fix timestep ranges when batch_size > 1 2023-07-27 21:14:09 +03:00
comfyanonymous
f9001baa44 Fix diffusers VAE loading. 2023-07-26 18:26:39 -04:00
comfyanonymous
5d64d20ef5 Fix some new loras. 2023-07-25 16:39:15 -04:00
comfyanonymous
38f847a81d Fix potential issue with Save Checkpoint. 2023-07-25 00:45:20 -04:00
comfyanonymous
2c22729cec Start is now 0.0 and end is now 1.0 for the timestep ranges. 2023-07-24 18:38:17 -04:00
comfyanonymous
857a80b857 ControlNetApplyAdvanced can now define when controlnet gets applied. 2023-07-24 17:50:49 -04:00
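These last two entries express the controlnet's active window as fractions of the sampling schedule, with start 0.0 and end 1.0 covering every step. A small sketch of how such fractions can map to steps; the parameter names are illustrative, not a claim about the exact node signature:

```python
def controlnet_active(step: int, total_steps: int,
                      start_percent: float = 0.0, end_percent: float = 1.0) -> bool:
    # Treat start/end as fractions of the schedule (0.0 = first step,
    # 1.0 = last step) and decide whether the controlnet applies now.
    progress = step / max(total_steps - 1, 1)
    return start_percent <= progress <= end_percent

# e.g. start_percent=0.0, end_percent=0.5 would apply the controlnet
# only during the first half of sampling.
```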