comfyanonymous | 8088be4f2e | Add --disable-metadata argument to disable saving metadata in files. | 2023-07-28 12:31:41 -04:00
comfyanonymous | 7ffe966c0a | Merge branch 'fix_batch_timesteps' of https://github.com/asagi4/ComfyUI | 2023-07-27 16:13:48 -04:00
comfyanonymous | cb47a5674c | Remove some prints. | 2023-07-27 16:12:43 -04:00
asagi4 | a63268b4e9 | Fix timestep ranges when batch_size > 1 | 2023-07-27 21:14:09 +03:00
comfyanonymous | f9001baa44 | Fix diffusers VAE loading. | 2023-07-26 18:26:39 -04:00
comfyanonymous | 5d64d20ef5 | Fix some new loras. | 2023-07-25 16:39:15 -04:00
comfyanonymous | 38f847a81d | Fix potential issue with Save Checkpoint. | 2023-07-25 00:45:20 -04:00
comfyanonymous | 2c22729cec | Start is now 0.0 and end is now 1.0 for the timestep ranges. | 2023-07-24 18:38:17 -04:00
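Commit 2c22729cec above (together with 156b047427 further down, which introduced the feature) describes conditioning timestep ranges expressed as fractions of the sampling schedule, with 0.0 as the start and 1.0 as the end. A minimal sketch of that idea only, not ComfyUI's actual code; the names cond_active, step, and total_steps are hypothetical:

```python
# Minimal sketch, not ComfyUI's implementation: a conditioning is considered
# active only when the normalized schedule position falls inside [start, end],
# where 0.0 is the first sampling step and 1.0 is the last (per 2c22729cec).
def cond_active(step: int, total_steps: int, start: float = 0.0, end: float = 1.0) -> bool:
    frac = step / max(total_steps - 1, 1)  # normalized position in the schedule
    return start <= frac <= end

# Example: a conditioning limited to the first half of a 20-step schedule.
active = [s for s in range(20) if cond_active(s, 20, start=0.0, end=0.5)]
print(active)  # steps 0 through 9
```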
comfyanonymous | 857a80b857 | ControlNetApplyAdvanced can now define when controlnet gets applied. | 2023-07-24 17:50:49 -04:00
comfyanonymous | 0ed4637a7a | Add a ControlNetApplyAdvanced node. The controlnet can be applied to the positive or negative prompt only by connecting it correctly. | 2023-07-24 13:35:20 -04:00
comfyanonymous | 156b047427 | Add a way to set which range of timesteps the cond gets applied to. | 2023-07-24 09:25:02 -04:00
comfyanonymous | aa8fde7d6b | Try to fix memory issue with lora. | 2023-07-22 21:38:56 -04:00
comfyanonymous | 4d83ca88a5 | Nodes can now patch the unet function. | 2023-07-22 17:01:12 -04:00
comfyanonymous | 676b635914 | Del the right object when applying lora. | 2023-07-22 11:25:49 -04:00
comfyanonymous | bfb95012c7 | Support controlnet in diffusers format. | 2023-07-21 22:58:16 -04:00
comfyanonymous | c8a5d3363c | Fix issue with lora in some cases when combined with model merging. | 2023-07-21 21:27:27 -04:00
comfyanonymous | be012f4b28 | Properly support SDXL diffusers unet with UNETLoader node. | 2023-07-21 14:38:56 -04:00
comfyanonymous | e2c3256a19 | Print errors and continue when lora weights are not compatible. | 2023-07-20 19:56:22 -04:00
comfyanonymous | b2879e0168 | Merge branch 'fix-AttributeError-module-'torch'-has-no-attribute-'mps'' of https://github.com/KarryCharon/ComfyUI | 2023-07-20 00:34:54 -04:00
comfyanonymous | c17cfd126d | Fix typo. | 2023-07-19 10:20:32 -04:00
comfyanonymous | d71acc2fe4 | Fix ddim issue with older torch versions. | 2023-07-19 10:16:00 -04:00
comfyanonymous | 3aad28d483 | Add MX450 and MX550 to list of cards with broken fp16. | 2023-07-19 03:08:30 -04:00
comfyanonymous | 4ec7f09adc | It's actually possible to torch.compile the unet now. | 2023-07-18 21:36:35 -04:00
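Commit 4ec7f09adc above notes that the unet can now be wrapped with torch.compile. A minimal, self-contained sketch assuming PyTorch 2.0+; the toy module below is a stand-in for illustration only, not ComfyUI's diffusion UNet, and ComfyUI's own wiring may differ:

```python
import torch
import torch.nn as nn

# Stand-in module used only for illustration; ComfyUI would compile its real UNet.
unet = nn.Sequential(
    nn.Conv2d(4, 8, 3, padding=1),
    nn.SiLU(),
    nn.Conv2d(8, 4, 3, padding=1),
)

# torch.compile (PyTorch 2.0+) traces and optimizes the forward pass on first call.
compiled_unet = torch.compile(unet)
out = compiled_unet(torch.randn(1, 4, 64, 64))
print(out.shape)  # torch.Size([1, 4, 64, 64])
```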
comfyanonymous | cea5c2adfb | Add key to indicate checkpoint is v_prediction when saving. | 2023-07-18 00:25:53 -04:00
comfyanonymous | 22abe3af9f | Fix device print on old torch version. | 2023-07-17 15:18:58 -04:00
comfyanonymous | 5108099f0c | Enable --cuda-malloc by default on torch 2.0 and up. Add --disable-cuda-malloc to disable it. | 2023-07-17 15:12:10 -04:00
comfyanonymous | 7da7500fcc | --windows-standalone-build now enables --cuda-malloc | 2023-07-17 14:10:36 -04:00
comfyanonymous | 5ddb2ca26f | Add a command line argument to enable backend:cudaMallocAsync | 2023-07-17 11:00:14 -04:00
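Commits 5ddb2ca26f and 5108099f0c above add and then default-enable --cuda-malloc, which targets PyTorch's backend:cudaMallocAsync allocator. A hedged illustration of the same allocator choice made directly through PyTorch's standard PYTORCH_CUDA_ALLOC_CONF environment variable, outside ComfyUI's own flag handling:

```python
import os

# PyTorch reads this variable at import time, so it must be set before importing torch.
# backend:cudaMallocAsync switches the CUDA caching allocator to the asynchronous
# cudaMallocAsync backend, the allocator the --cuda-malloc flag is about.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "backend:cudaMallocAsync")

import torch  # imported after setting the allocator config on purpose

if torch.cuda.is_available():
    x = torch.zeros(1024, 1024, device="cuda")  # allocated through the async backend
```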
comfyanonymous | f9688727f1 | Only calculate randn in some samplers when it's actually being used. | 2023-07-17 10:11:08 -04:00
comfyanonymous | 244987af7e | Fix regression with ddim and uni_pc when batch size > 1. | 2023-07-17 09:35:19 -04:00
comfyanonymous | debccdc6f9 | Refactor of sampler code to deal more easily with different model types. | 2023-07-17 01:22:12 -04:00
comfyanonymous | ba6e888eb9 | Lower lora ram usage when in normal vram mode. | 2023-07-16 02:59:04 -04:00
comfyanonymous | 73c2afbe44 | Speed up lora loading a bit. | 2023-07-15 13:25:22 -04:00
comfyanonymous | f67f1c99b8 | Fix CLIPSetLastLayer not reverting when removed. | 2023-07-15 01:41:21 -04:00
comfyanonymous | 26da73b600 | Reduce floating point rounding errors in loras. | 2023-07-15 00:53:00 -04:00
comfyanonymous | daac253452 | Add a node to merge CLIP models. | 2023-07-14 02:41:18 -04:00
comfyanonymous | f4c18db4ed | Prevent the clip_g position_ids key from being saved in the checkpoint. This is to make it match the official checkpoint. | 2023-07-12 20:15:02 -04:00
comfyanonymous | 28bf6d49da | Fix potential tensors being on different devices issues. | 2023-07-12 19:29:27 -04:00
KarryCharon | 3ee78c064b | fix mps miss import | 2023-07-12 10:06:34 +08:00
comfyanonymous | 6e99974161 | Support SDXL embedding format with 2 CLIP. | 2023-07-10 10:34:59 -04:00
comfyanonymous | c1c170f61a | Don't patch weights when multiplier is zero. | 2023-07-09 17:46:56 -04:00
comfyanonymous | 55d00ccefd | latent2rgb matrix for SDXL. | 2023-07-09 13:59:09 -04:00
comfyanonymous | 42805fd416 | Empty cache after model unloading for normal vram and lower. | 2023-07-09 09:56:03 -04:00
comfyanonymous | 7d69d770e1 | Support loading clip_g from diffusers in CLIP Loader nodes. | 2023-07-09 09:33:53 -04:00
comfyanonymous | c5779f04aa | Fix merging not working when model2 of model merge node was a merge. | 2023-07-08 22:31:10 -04:00
comfyanonymous | 4685d2b07f | Merge branch 'condmask-fix' of https://github.com/vmedea/ComfyUI | 2023-07-07 01:52:25 -04:00
comfyanonymous | 9caaa09c71 | Add arguments to run the VAE in fp16 or bf16 for testing. | 2023-07-06 23:23:46 -04:00
comfyanonymous | d3b3c94616 | Fix bug with weights when prompt is long. | 2023-07-06 02:43:40 -04:00
comfyanonymous | fa8010f038 | Disable autocast in unet for increased speed. | 2023-07-05 21:58:29 -04:00
comfyanonymous | c6391df3a5 | Fix loras not working when loading checkpoint with config. | 2023-07-05 19:42:24 -04:00