Commit Graph

132 Commits

Author SHA1 Message Date
Pedro Batista
0169c95d82
Don't use xformers nor fp16 in AMD GPUs
xformers and fp16 are not supported on AMD GPUs, so disable both there. Example device name:

```
In [8]: torch.cuda.get_device_properties("cuda").name
Out[8]: 'AMD Radeon RX 5700 XT'
```
2023-04-05 13:14:15 -03:00
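A guard like the one this commit describes could be sketched as below. The function name and the substring match are assumptions for illustration, not the actual ComfyUI code, which inspects the CUDA device properties shown above:

```python
def should_disable_xformers_and_fp16(device_name):
    # Hypothetical check: AMD cards report names like
    # 'AMD Radeon RX 5700 XT', so match on the vendor prefix.
    return "AMD" in device_name or device_name.startswith("Radeon")

flag = should_disable_xformers_and_fp16("AMD Radeon RX 5700 XT")
```

In practice the device name would come from `torch.cuda.get_device_properties(device).name` as in the commit message.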
comfyanonymous
e46b1c3034 Disable xformers in VAE when xformers == 0.0.18 2023-04-04 22:22:02 -04:00
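The version gate for the VAE might look like this minimal sketch (the function name is assumed; the real code may compare parsed version objects rather than raw strings):

```python
def vae_can_use_xformers(xformers_version):
    # xformers 0.0.18 misbehaves on the VAE attention path per
    # this commit, so that exact version is excluded.
    return xformers_version != "0.0.18"

ok = vae_can_use_xformers("0.0.17")
```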
comfyanonymous
1718730e80 Ignore embeddings when sizes don't match and print a WARNING. 2023-04-04 11:49:29 -04:00
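The behavior this commit describes — skip a mismatched embedding with a warning instead of failing — can be sketched as follows. The helper name and list-of-vectors representation are assumptions; the real code operates on tensors:

```python
import warnings

def load_embedding(vectors, expected_dim):
    # Hypothetical loader: ignore an embedding whose vector width
    # doesn't match the model's token-embedding dimension,
    # emitting a warning instead of crashing.
    if vectors and len(vectors[0]) != expected_dim:
        warnings.warn(
            f"WARNING: embedding size {len(vectors[0])} does not "
            f"match expected {expected_dim}; ignoring it"
        )
        return None
    return vectors

good = load_embedding([[0.0] * 768], 768)
bad = load_embedding([[0.0] * 1024], 768)
```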
comfyanonymous
23524ad8c5 Remove print. 2023-04-03 22:58:54 -04:00
comfyanonymous
539ff487a8 Pull latest tomesd code from upstream. 2023-04-03 15:49:28 -04:00
comfyanonymous
f50b1fec69 Add noise augmentation setting to unCLIPConditioning. 2023-04-03 13:50:29 -04:00
comfyanonymous
809bcc8ceb Add support for unCLIP SD2.x models.
See _for_testing/unclip in the UI for the new nodes.

unCLIPCheckpointLoader is used to load them.

unCLIPConditioning is used to add the image cond and takes as input a
CLIPVisionEncode output which has been moved to the conditioning section.
2023-04-01 23:19:15 -04:00
comfyanonymous
0d972b85e6 This seems to give better quality in tome. 2023-03-31 18:36:18 -04:00
comfyanonymous
18a6c1db33 Add a TomePatchModel node to the _for_testing section.
Tome increases sampling speed at the expense of quality.
2023-03-31 17:19:58 -04:00
comfyanonymous
61ec3c9d5d Add a way to pass options to the transformers blocks. 2023-03-31 13:04:39 -04:00
comfyanonymous
afd65d3819 Fix noise mask not working with > 1 batch size on ksamplers. 2023-03-30 03:50:12 -04:00
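The class of bug fixed here — a single-image noise mask not being repeated across a larger batch — can be illustrated with a small sketch (function name and list representation are assumptions, not the ksampler code):

```python
def expand_noise_mask(mask_batch, batch_size):
    # Hypothetical fix: a noise mask stored for one image must be
    # repeated so every sample in a larger batch gets a copy.
    if len(mask_batch) == 1 and batch_size > 1:
        return mask_batch * batch_size
    return mask_batch

masks = expand_noise_mask([[1, 0, 1]], 4)
```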
comfyanonymous
b2554bc4dd Split VAE decode batches depending on free memory. 2023-03-29 02:24:37 -04:00
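Splitting a decode batch by available memory could be planned roughly as below. The function name and cost model are assumptions; the real code estimates memory from the latent dimensions:

```python
def plan_decode_batches(batch_size, free_memory, per_image_cost):
    # Hypothetical planner: fit as many images per decode pass as
    # the estimated free memory allows, never fewer than one.
    per_pass = max(1, int(free_memory // per_image_cost))
    return [min(per_pass, batch_size - i)
            for i in range(0, batch_size, per_pass)]

plan = plan_decode_batches(8, 3.0, 1.0)
```

For example, 8 images with room for 3 per pass decode as three passes of sizes 3, 3, and 2.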
comfyanonymous
0d65cb17b7 Fix ddim_uniform crashing with 37 steps. 2023-03-28 16:29:35 -04:00
Francesco Yoshi Gobbo
f55755f0d2
code cleanup 2023-03-27 06:48:09 +02:00
Francesco Yoshi Gobbo
cf0098d539
no lowvram state if cpu only 2023-03-27 04:51:18 +02:00
comfyanonymous
f5365c9c81 Fix ddim for Mac: #264 2023-03-26 00:36:54 -04:00
comfyanonymous
4adcea7228 I don't think controlnets were being handled correctly by MPS. 2023-03-24 14:33:16 -04:00
comfyanonymous
3c6ff8821c Merge branch 'master' of https://github.com/GaidamakUA/ComfyUI 2023-03-24 13:56:43 -04:00
Yurii Mazurevich
fc71e7ea08 Fixed typo 2023-03-24 19:39:55 +02:00
comfyanonymous
7f0fd99b5d Make ddim work with --cpu 2023-03-24 11:39:51 -04:00
Yurii Mazurevich
4b943d2b60 Removed unnecessary comment 2023-03-24 14:15:30 +02:00
Yurii Mazurevich
89fd5ed574 Added MPS device support 2023-03-24 14:12:56 +02:00
comfyanonymous
dd095efc2c Support loha that use cp decomposition. 2023-03-23 04:32:25 -04:00
comfyanonymous
94a7c895f4 Add loha support. 2023-03-23 03:40:12 -04:00
comfyanonymous
3ed4a4e4e6 Try again with vae tiled decoding if regular fails because of OOM. 2023-03-22 14:49:00 -04:00
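The retry logic described here can be sketched as a simple try/except around the regular decode. Names are assumptions, and `MemoryError` stands in for the CUDA out-of-memory exception the real code would catch:

```python
def decode_with_tiled_fallback(decode, decode_tiled, latents):
    # Sketch of the fallback: attempt a regular decode, and retry
    # with tiled decoding only when memory runs out.
    try:
        return decode(latents)
    except MemoryError:
        return decode_tiled(latents)

def oom_decode(latents):
    # Simulates a decode that always runs out of memory.
    raise MemoryError("simulated OOM")

result = decode_with_tiled_fallback(oom_decode, lambda l: ("tiled", l), "latents")
```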
comfyanonymous
4039616ca6 Less seams in tiled outputs at the cost of more processing. 2023-03-22 03:29:09 -04:00
comfyanonymous
c692509c2b Try to improve VAEEncode memory usage a bit. 2023-03-22 02:45:18 -04:00
comfyanonymous
9d0665c8d0 Add laptop quadro cards to fp32 list. 2023-03-21 16:57:35 -04:00
comfyanonymous
cc309568e1 Add support for locon mid weights. 2023-03-21 14:51:51 -04:00
comfyanonymous
edfc4ca663 Try to fix a vram issue with controlnets. 2023-03-19 10:50:38 -04:00
comfyanonymous
b4b21be707 Fix area composition feathering not working properly. 2023-03-19 02:00:52 -04:00
comfyanonymous
50099bcd96 Support multiple paths for embeddings. 2023-03-18 03:08:43 -04:00
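Supporting multiple embedding directories amounts to a first-match search over the configured paths, roughly like this sketch (the function name is an assumption):

```python
import os
import tempfile

def find_embedding_file(filename, search_paths):
    # Hypothetical lookup: check each configured embeddings
    # directory in order and return the first existing file.
    for base in search_paths:
        candidate = os.path.join(base, filename)
        if os.path.isfile(candidate):
            return candidate
    return None

# Tiny demo: one real directory containing the file, one that doesn't exist.
demo_dir = tempfile.mkdtemp()
open(os.path.join(demo_dir, "style.pt"), "w").close()
found = find_embedding_file("style.pt", ["/nonexistent", demo_dir])
missing = find_embedding_file("other.pt", [demo_dir])
```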
comfyanonymous
2e73367f45 Merge T2IAdapterLoader and ControlNetLoader.
Workflows will be auto updated.
2023-03-17 18:17:59 -04:00
comfyanonymous
ee46bef03a Make --cpu have priority over everything else. 2023-03-13 21:30:01 -04:00
comfyanonymous
0e836d525e use half() on fp16 models loaded with config. 2023-03-13 21:12:48 -04:00
comfyanonymous
986dd820dc Use half() function on model when loading in fp16. 2023-03-13 20:58:09 -04:00
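The two commits above both concern converting a model with `.half()` when its weights are fp16. A minimal sketch, using a dummy stand-in for a `torch.nn.Module` so it runs without PyTorch (all names here are assumptions):

```python
class DummyModel:
    # Minimal stand-in for a torch.nn.Module: .half() converts
    # parameters to 16-bit floats and returns the model.
    def __init__(self):
        self.dtype = "fp32"

    def half(self):
        self.dtype = "fp16"
        return self

def load_checkpoint(model, fp16=False):
    # Sketch: when the config marks the weights as fp16, convert
    # the freshly loaded model with .half().
    return model.half() if fp16 else model

m = load_checkpoint(DummyModel(), fp16=True)
```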
comfyanonymous
54dbfaf2ec Remove omegaconf dependency and some ci changes. 2023-03-13 14:49:18 -04:00
comfyanonymous
83f23f82b8 Add pytorch attention support to VAE. 2023-03-13 12:45:54 -04:00
comfyanonymous
a256a2abde --disable-xformers should not even try to import xformers. 2023-03-13 11:36:48 -04:00
comfyanonymous
0f3ba7482f Xformers is now properly disabled when --cpu is used.
Added --windows-standalone-build option; currently it only makes
the code open ComfyUI in the browser.
2023-03-12 15:44:16 -04:00
comfyanonymous
e33dc2b33b Add a VAEEncodeTiled node. 2023-03-11 15:28:15 -05:00
comfyanonymous
1de86851b1 Try to fix memory issue. 2023-03-11 15:15:13 -05:00
comfyanonymous
2b1fce2943 Make tiled_scale work for downscaling. 2023-03-11 14:58:55 -05:00
comfyanonymous
9db2e97b47 Tiled upscaling with the upscale models. 2023-03-11 14:04:13 -05:00
comfyanonymous
cd64111c83 Add locon support. 2023-03-09 21:41:24 -05:00
comfyanonymous
c70f0ac64b SD2.x controlnets now work. 2023-03-08 01:13:38 -05:00
comfyanonymous
19415c3ace Relative imports to test something. 2023-03-07 11:00:35 -05:00
edikius
165be5828a
Fixed import (#44)
* Fixed import error

I hit an
ImportError: cannot import name 'Protocol' from 'typing'
while trying to update, so I fixed the import to let the app start.

* Update main.py

* deleted example files
2023-03-06 11:41:40 -05:00
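`typing.Protocol` only exists from Python 3.8 onward, so an ImportError like the one above is commonly handled with a backport fallback. A sketch of the usual pattern (this PR may have fixed it differently):

```python
try:
    from typing import Protocol  # available since Python 3.8
except ImportError:
    # Fallback for older interpreters via the typing_extensions backport.
    from typing_extensions import Protocol
```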
comfyanonymous
501f19eec6 Fix clip_skip no longer being loaded from yaml file. 2023-03-06 11:34:02 -05:00
comfyanonymous
afff30fc0a Add --cpu to use the cpu for inference. 2023-03-06 10:50:50 -05:00