459787f78f | comfyanonymous | 2023-10-17 15:18:51 -04:00 | Make VAE code closer to sgm.
28b98a96d3 | comfyanonymous | 2023-10-17 03:19:29 -04:00 | Refactor the attention stuff in the VAE.
494ddf7717 | comfyanonymous | 2023-10-11 21:30:57 -04:00 | pytorch_attention_enabled can now return True when xformers is enabled.
c60864b5e4 | comfyanonymous | 2023-10-11 20:38:48 -04:00 | Refactor the attention functions. There's no reason for the whole CrossAttention object to be repeated when only the operation in the middle changes.
ef0c0892f6 | comfyanonymous | 2023-09-04 00:58:18 -04:00 | Add a force argument to soft_empty_cache to force a cache empty.
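The force argument described in the commit above amounts to a simple guard around the cache-empty call. A minimal sketch, assuming a heuristic gate; `backend_needs_empty` and `empty_cache` are illustrative stand-ins, not ComfyUI's actual model_management code:

```python
# Hedged sketch of a force-able cache empty. All names here are
# illustrative stand-ins, not ComfyUI's real API.

calls = []

def empty_cache():
    """Stand-in for the backend call (e.g. torch.cuda.empty_cache())."""
    calls.append("emptied")

def backend_needs_empty():
    """Heuristic: empty only when the backend benefits (stubbed to False)."""
    return False

def soft_empty_cache(force=False):
    """Empty the device cache only when forced or the heuristic says so."""
    if force or backend_needs_empty():
        empty_cache()

soft_empty_cache()            # heuristic says no: cache is left alone
soft_empty_cache(force=True)  # force=True bypasses the heuristic
```

Emptying the cache is expensive, so gating it behind a heuristic by default while letting callers force it is a common compromise.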
2ab478346d | comfyanonymous | 2023-08-29 11:21:36 -04:00 | Remove optimization that caused border.
6932eda1fb | comfyanonymous | 2023-08-27 22:24:42 -04:00 | Fall back to sliced attention if xformers doesn't support the operation.
1b9a6a9599 | comfyanonymous | 2023-08-27 21:33:53 -04:00 | Make --bf16-vae work on torch 2.0.
71ffc7b350 | comfyanonymous | 2023-07-29 16:28:30 -04:00 | Faster VAE loading.
51da619d73 | comfyanonymous | 2023-06-23 12:35:26 -04:00 | Remove useless code.
c0a5444d1b | comfyanonymous | 2023-05-20 16:01:02 -04:00 | Make scaled_dot_product switch to sliced attention on OOM.
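The commit above describes a common fallback pattern: attempt the full attention operation and, if it runs out of memory, redo the work in slices that each fit. A framework-free sketch of the pattern; the simulated OOM threshold and slice size are invented for illustration:

```python
# Hedged sketch of the OOM -> sliced fallback; names and the simulated
# memory limit are illustrative, not ComfyUI's actual attention code.

def full_op(values):
    """Stand-in for the full attention op; pretend inputs longer than
    4 elements exhaust memory."""
    if len(values) > 4:
        raise MemoryError("simulated OOM")
    return [v * 2 for v in values]

def sliced_op(values, slice_size=2):
    """Redo the work in small slices so each piece fits in memory."""
    out = []
    for i in range(0, len(values), slice_size):
        out.extend(full_op(values[i:i + slice_size]))
    return out

def op_with_oom_fallback(values):
    """Try the fast full operation first; fall back to slicing on OOM."""
    try:
        return full_op(values)
    except MemoryError:
        return sliced_op(values)

result = op_with_oom_fallback(list(range(6)))  # too big for full_op
```

The slow path only runs when the fast path actually fails, so small inputs pay no overhead.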
e33cb62d1b | comfyanonymous | 2023-05-20 15:07:21 -04:00 | Simplify and improve some VAE attention code.
2edaaba3c2 | comfyanonymous | 2023-05-04 18:10:29 -04:00 | Fix imports.
4df70d0f62 | comfyanonymous | 2023-04-15 19:04:33 -04:00 | Fix model_management import so it doesn't get executed twice.
7597a5d83e | comfyanonymous | 2023-04-04 22:22:02 -04:00 | Disable xformers in the VAE when xformers == 0.0.18.
7cae5f5769 | comfyanonymous | 2023-03-22 14:49:00 -04:00 | Retry with VAE tiled decoding if the regular decode fails because of OOM.
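The retry in the commit above follows the same shape at the decoder level: attempt a regular full-image decode, and on OOM decode the latent tile by tile instead. A minimal sketch under invented assumptions (the OOM threshold, tile size, and decode math are placeholders):

```python
# Hedged sketch of the regular-decode -> tiled-decode retry; all numbers
# and names are placeholders, not ComfyUI's actual VAE code.

def decode(latent):
    """Stand-in for a full VAE decode; pretend large latents exhaust
    memory."""
    if len(latent) > 4:
        raise MemoryError("simulated OOM")
    return [x + 0.5 for x in latent]

def decode_tiled(latent, tile=2):
    """Decode the latent tile by tile so each tile fits in memory."""
    pixels = []
    for i in range(0, len(latent), tile):
        pixels.extend(decode(latent[i:i + tile]))
    return pixels

def decode_with_retry(latent):
    """Try the regular decode; retry with tiled decoding on OOM."""
    try:
        return decode(latent)
    except MemoryError:
        return decode_tiled(latent)

image = decode_with_retry([0.0] * 6)  # too large for the regular decode
```

Real tiled VAE decoding is more involved than this sketch: tiles typically overlap and are blended so that seams don't show at tile borders.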
13ec5cd3c2 | comfyanonymous | 2023-03-22 02:45:18 -04:00 | Try to improve VAEEncode memory usage a bit.
1de721b33c | comfyanonymous | 2023-03-13 12:45:54 -04:00 | Add pytorch attention support to the VAE.
72b42ab260 | comfyanonymous | 2023-03-13 11:36:48 -04:00 | --disable-xformers should not even try to import xformers.
7c95e1a03b | comfyanonymous | 2023-03-12 15:44:16 -04:00 | Properly disable xformers when --cpu is used. Add a --windows-standalone-build option; currently it only makes ComfyUI open in the browser.
bbdc5924b4 | comfyanonymous | 2023-03-11 15:15:13 -05:00 | Try to fix a memory issue.
f8f2ea3bb1 | comfyanonymous | 2023-03-05 14:20:07 -05:00 | Make the VAE use a common function to get free memory.
509c7dfc6d | comfyanonymous | 2023-02-10 03:13:49 -05:00 | Use a real softmax in the split op to fix an issue with some images.
773cdabfce | comfyanonymous | 2023-02-09 12:43:29 -05:00 | Apply the same split optimization in the other places where it's used.
e8c499ddd4 | comfyanonymous | 2023-02-08 22:04:20 -05:00 | Split optimization for the VAE attention block.
5b4e312749 | comfyanonymous | 2023-02-08 22:04:13 -05:00 | Use in-place operations for fewer OOM issues.
220afe3310 | comfyanonymous | 2023-01-16 22:37:14 -05:00 | Initial commit.