Commit Graph

18 Commits

Author SHA1 Message Date
comfyanonymous
6c156642e4 Add support for GLIGEN textbox model. 2023-04-19 11:06:32 -04:00
comfyanonymous
4df70d0f62 Fix model_management import so it doesn't get executed twice. 2023-04-15 19:04:33 -04:00
EllangoK
cd00b46465 separates out arg parser and imports args 2023-04-05 23:41:23 -04:00
comfyanonymous
7da8d5f9f5 Add a TomePatchModel node to the _for_testing section.
Tome increases sampling speed at the expense of quality.
2023-03-31 17:19:58 -04:00
comfyanonymous
bcdd11d687 Add a way to pass options to the transformers blocks. 2023-03-31 13:04:39 -04:00
comfyanonymous
7cae5f5769 Try again with vae tiled decoding if regular fails because of OOM. 2023-03-22 14:49:00 -04:00
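The commit above describes a try-then-fallback pattern: attempt a full VAE decode, and only if it runs out of memory retry with tiled decoding. A minimal plain-Python sketch of that pattern follows; `decode_regular` and `decode_tiled` are hypothetical stand-ins for the real VAE decode calls, and `MemoryError` stands in for the CUDA out-of-memory exception the actual code would catch.

```python
def decode_with_fallback(latent, decode_regular, decode_tiled):
    """Try a full decode first; on out-of-memory, retry with tiling.

    decode_regular / decode_tiled are hypothetical stand-ins for the
    real VAE decode calls; ComfyUI would catch a CUDA OOM exception
    rather than MemoryError.
    """
    try:
        return decode_regular(latent)
    except MemoryError:
        print("Warning: regular VAE decode ran out of memory, "
              "retrying with tiled decoding.")
        return decode_tiled(latent)

# usage sketch: a decoder that always "OOMs" falls back to tiling
def oom_decode(latent):
    raise MemoryError

result = decode_with_fallback([1, 2, 3], oom_decode, lambda l: [x * 2 for x in l])
```

Tiled decoding trades a little quality at tile seams for a much smaller peak memory footprint, which is why it is used only as the fallback path here.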
comfyanonymous
1de721b33c Add pytorch attention support to VAE. 2023-03-13 12:45:54 -04:00
comfyanonymous
72b42ab260 --disable-xformers should not even try to import xformers. 2023-03-13 11:36:48 -04:00
comfyanonymous
7c95e1a03b Xformers is now properly disabled when --cpu used.
Added --windows-standalone-build option; currently it only
makes the code open up comfyui in the browser.
2023-03-12 15:44:16 -04:00
comfyanonymous
ffc5c6707b Fix pytorch 2.0 cross attention not working. 2023-03-05 14:14:54 -05:00
comfyanonymous
b2a7f1b32a Make some cross attention functions work on the CPU. 2023-03-03 03:27:33 -05:00
comfyanonymous
666a9e8604 Add some pytorch scaled_dot_product_attention code for testing.
--use-pytorch-cross-attention to use it.
2023-03-02 17:01:20 -05:00
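The commit above adds an attention path based on PyTorch's `torch.nn.functional.scaled_dot_product_attention`, opt-in via `--use-pytorch-cross-attention`. As a reference for what that function computes, here is a minimal plain-Python sketch of single-head scaled dot-product attention, softmax(QK^T / sqrt(d))V; the real code path calls the fused PyTorch kernel instead of doing this math by hand.

```python
import math

def scaled_dot_product_attention(q, k, v):
    """Plain-Python sketch of softmax(Q K^T / sqrt(d)) V for 2-D,
    single-head inputs (lists of row vectors). Illustrative only;
    PyTorch's fused kernel is what the commit actually uses."""
    d = len(q[0])
    # scores[i][j] = dot(q[i], k[j]) / sqrt(d)
    scores = [[sum(qi * ki for qi, ki in zip(qrow, krow)) / math.sqrt(d)
               for krow in k] for qrow in q]
    out = []
    for row in scores:
        # numerically stable softmax over each query's scores
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        total = sum(exps)
        weights = [e / total for e in exps]
        # output row is the attention-weighted sum of value rows
        out.append([sum(w * vrow[c] for w, vrow in zip(weights, v))
                    for c in range(len(v[0]))])
    return out

# usage sketch: each query attends most strongly to its matching key
attn = scaled_dot_product_attention([[1.0, 0.0], [0.0, 1.0]],
                                    [[1.0, 0.0], [0.0, 1.0]],
                                    [[1.0, 0.0], [0.0, 1.0]])
```

Because each output row is a convex combination of the value rows, rows of `attn` sum to 1 when the value rows do, and the diagonal entries dominate since each query matches its own key best.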
comfyanonymous
151fed3dfb Hopefully fix a strange issue with xformers + lowvram. 2023-02-28 13:48:52 -05:00
comfyanonymous
2124888a6c Remove prints that are useless when xformers is enabled. 2023-02-21 22:16:13 -05:00
comfyanonymous
773cdabfce Same thing but for the other places where it's used. 2023-02-09 12:43:29 -05:00
comfyanonymous
50db297cf6 Try to fix OOM issues with cards that have less vram than mine. 2023-01-29 00:50:46 -05:00
comfyanonymous
051f472e8f Fix sub quadratic attention for SD2 and make it the default optimization. 2023-01-25 01:22:43 -05:00
comfyanonymous
220afe3310 Initial commit. 2023-01-16 22:37:14 -05:00