Commit Graph

247 Commits

Author SHA1 Message Date
hlky
da1ab7b298
Merge branch 'master' into aitemplate 2023-06-01 13:19:40 +01:00
hlky
4b8a650932 AITemplate ControlNet 2023-06-01 13:14:27 +01:00
comfyanonymous
5c38958e49 Tweak lowvram model memory so it's closer to what it was before. 2023-06-01 04:04:35 -04:00
comfyanonymous
94680732d3 Empty cache on mps. 2023-06-01 03:52:51 -04:00
hlky
fbc74fbb25 AITemplate uses main sampling function 2023-05-31 20:52:20 +01:00
comfyanonymous
03da8a3426 This is useless for inference. 2023-05-31 13:03:24 -04:00
comfyanonymous
eb448dd8e1 Auto load model in lowvram if not enough memory. 2023-05-30 12:36:41 -04:00
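The auto-lowvram behavior in this commit boils down to comparing the model's size against available device memory before loading. A minimal sketch of that decision (function name, byte units, and the headroom factor are illustrative, not the repo's code):

```python
def choose_load_mode(model_bytes: int, free_vram_bytes: int, headroom: float = 1.3) -> str:
    """Keep the whole model on the GPU only when it fits with some
    headroom left for activations; otherwise fall back to lowvram mode,
    which streams weights to the device as they are needed."""
    if model_bytes * headroom <= free_vram_bytes:
        return "normal"
    return "lowvram"
```

For example, a 6 GiB model with 8 GiB free loads normally, while a 7 GiB model on the same card falls back to lowvram.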
comfyanonymous
b9818eb910 Add route to get safetensors metadata:
/view_metadata/loras?filename=lora.safetensors
2023-05-29 02:48:50 -04:00
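The metadata this route serves comes from the safetensors header: the file starts with an 8-byte little-endian length, followed by that many bytes of JSON whose optional "__metadata__" key holds user metadata. A sketch of the parse (the function name is illustrative; the layout is the documented safetensors format):

```python
import json
import struct

def read_safetensors_metadata(data: bytes) -> dict:
    """Parse the __metadata__ block from the start of a safetensors
    byte stream: 8-byte little-endian header length, then that many
    bytes of JSON describing the tensors plus optional "__metadata__"."""
    (header_len,) = struct.unpack("<Q", data[:8])
    header = json.loads(data[8:8 + header_len])
    return header.get("__metadata__", {})
```

Because only the header is read, the route can answer without loading any tensor data.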
comfyanonymous
a532888846 Support VAEs in diffusers format. 2023-05-28 02:02:09 -04:00
comfyanonymous
0fc483dcfd Refactor diffusers model convert code to be able to reuse it. 2023-05-28 01:55:40 -04:00
comfyanonymous
eb4bd7711a Remove einops. 2023-05-25 18:42:56 -04:00
comfyanonymous
87ab25fac7 Do operations in same order as the one it replaces. 2023-05-25 18:31:27 -04:00
comfyanonymous
2b1fac9708 Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI 2023-05-25 14:44:16 -04:00
comfyanonymous
e1278fa925 Support old pytorch versions that don't have weights_only. 2023-05-25 13:30:59 -04:00
BlenderNeko
8b4b0c3188 vectorized bislerp 2023-05-25 19:23:47 +02:00

comfyanonymous
b8ccbec6d8 Various improvements to bislerp. 2023-05-23 11:40:24 -04:00
comfyanonymous
34887b8885 Add experimental bislerp algorithm for latent upscaling.
It's like bilinear but with slerp.
2023-05-23 03:12:56 -04:00
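"Bilinear but with slerp" means the four-neighbor blend of bilinear upscaling is done with spherical linear interpolation, which preserves vector magnitude better than a straight lerp. A sketch of the slerp primitive on plain vectors (not the repo's batched tensor implementation):

```python
import math

def slerp(a, b, t):
    """Spherical linear interpolation between two vectors, falling back
    to linear interpolation when they are nearly parallel and
    sin(omega) ~ 0 would be numerically unstable."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    cos_omega = max(-1.0, min(1.0, dot / (na * nb)))
    omega = math.acos(cos_omega)
    if omega < 1e-6:
        return [(1 - t) * x + t * y for x, y in zip(a, b)]
    wa = math.sin((1 - t) * omega) / math.sin(omega)
    wb = math.sin(t * omega) / math.sin(omega)
    return [wa * x + wb * y for x, y in zip(a, b)]
```

Halfway between two orthogonal unit vectors, slerp stays on the unit sphere, whereas lerp would shrink the result toward the origin.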
hlky
7e4da3c48a
Merge branch 'master' into aitemplate 2023-05-22 08:42:54 +01:00
comfyanonymous
6cc450579b Auto transpose images from exif data. 2023-05-22 00:22:24 -04:00
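Auto-transposing from EXIF means reading the orientation tag (values 1-8) and applying the flips/rotations that put the pixels upright. A sketch of the mapping as operation names (the dict and function are illustrative; the value-to-operation table follows the EXIF convention):

```python
# EXIF orientation tag (0x0112) values mapped to the operations needed
# to display the stored pixels upright.
EXIF_TRANSPOSE_OPS = {
    1: [],                                  # already upright
    2: ["flip_horizontal"],
    3: ["rotate_180"],
    4: ["flip_vertical"],
    5: ["flip_horizontal", "rotate_270"],
    6: ["rotate_90"],
    7: ["flip_horizontal", "rotate_90"],
    8: ["rotate_270"],
}

def exif_transpose_ops(orientation: int) -> list:
    """Return the operations that undo the camera's stored orientation;
    unknown or missing tags fall through to no-op."""
    return EXIF_TRANSPOSE_OPS.get(orientation, [])
```

With Pillow installed, `PIL.ImageOps.exif_transpose(img)` performs the whole correction in one call.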
comfyanonymous
dc198650c0 sample_dpmpp_2m_sde no longer crashes when step == 1. 2023-05-21 11:34:29 -04:00
comfyanonymous
069657fbf3 Add DPM-Solver++(2M) SDE and exponential scheduler.
The exponential scheduler is the one recommended with this sampler.
2023-05-21 01:46:03 -04:00
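An exponential scheduler spaces the noise levels evenly in log-space from sigma_max down to sigma_min, with a trailing 0.0 for the final denoised step. A sketch of that scheme (function name and signature are illustrative, modeled on the common k-diffusion convention, not copied from the repo):

```python
import math

def get_sigmas_exponential(n: int, sigma_min: float, sigma_max: float):
    """Return n noise levels evenly spaced in log-space from sigma_max
    down to sigma_min, plus the final 0.0 samplers expect."""
    log_max, log_min = math.log(sigma_max), math.log(sigma_min)
    sigmas = [math.exp(log_max + (i / (n - 1)) * (log_min - log_max))
              for i in range(n)]
    return sigmas + [0.0]
```

Log-spacing front-loads the large denoising steps, which pairs well with second-order SDE samplers like DPM-Solver++(2M) SDE.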
comfyanonymous
b8636a44aa Make scaled_dot_product switch to sliced attention on OOM. 2023-05-20 16:01:02 -04:00
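The fallback pattern here is: attempt the fast fused attention over the whole batch, and on an out-of-memory error redo the work in smaller query slices. A generic sketch (in torch the caught exception would be the CUDA OOM error; `MemoryError` and the helper names stand in here):

```python
def attention_with_fallback(q_rows, attend_full, attend_slice, slice_size=2):
    """Try the fast full-batch path; on an out-of-memory error, redo the
    work in smaller slices of the query and concatenate the results."""
    try:
        return attend_full(q_rows)
    except MemoryError:
        out = []
        for i in range(0, len(q_rows), slice_size):
            out.extend(attend_slice(q_rows[i:i + slice_size]))
        return out
```

The sliced path trades speed for a bounded peak memory footprint, so large images still complete instead of crashing.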
comfyanonymous
797c4e8d3b Simplify and improve some vae attention code. 2023-05-20 15:07:21 -04:00
hlky
d879a33956 Fix for long prompts 2023-05-16 20:55:19 +01:00
hlky
6769b918fb Fix for negative prompt in AITemplateModelWrapper 2023-05-16 18:44:51 +01:00
hlky
f380b89a71 dynamic batch size 2023-05-16 08:21:50 +01:00
hlky
c8195ab3e5 Pass AITemplate, MODEL to AITemplate KSampler, -diffusers req
* Pass MODEL to AITemplate KSampler, but don't move to device
* Take alphas_cumprod from MODEL, -diffusers req, move alphas_cumprod only to device
* -checks for AITemplateModelWrapper, inpaint etc maybe still won't work, untested
  * v2 should work, but will require module compiled for v2
2023-05-16 07:15:12 +01:00
hlky
6c1c6232eb Update samplers.py 2023-05-15 19:35:51 +01:00
hlky
abba81d8a6 fix import 2023-05-15 19:24:00 +01:00
hlky
eccba18c17 include aitemplate Model 2023-05-15 19:15:50 +01:00
hlky
b32c2eaafd aitemplate 2023-05-15 19:11:30 +01:00
comfyanonymous
ef815ba1e2 Switch default scheduler to normal. 2023-05-15 00:29:56 -04:00
comfyanonymous
68d12b530e Merge branch 'tiled_sampler' of https://github.com/BlenderNeko/ComfyUI 2023-05-14 15:39:39 -04:00
comfyanonymous
3a1f47764d Print the torch device that is used on startup. 2023-05-13 17:11:27 -04:00
BlenderNeko
1201d2eae5
Make nodes map over input lists (#579)
* allow nodes to map over lists

* make work with IS_CHANGED and VALIDATE_INPUTS

* give list outputs distinct socket shape

* add rebatch node

* add batch index logic

* add repeat latent batch

* deal with noise mask edge cases in latentfrombatch
2023-05-13 11:15:45 -04:00
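Mapping a node over input lists means calling it once per index up to the longest list, broadcasting shorter lists so every call gets a full argument set. A sketch of that dispatch (the modulo-indexing convention for mismatched lengths is an assumption of this sketch, as is the function name):

```python
def map_over_lists(fn, **list_inputs):
    """Call fn once per index up to the longest input list; shorter
    lists are indexed modulo their length so every call receives a
    complete set of arguments."""
    n = max(len(v) for v in list_inputs.values())
    return [fn(**{k: v[i % len(v)] for k, v in list_inputs.items()})
            for i in range(n)]
```

For example, mapping an add node over `a=[1, 2, 3]` and a single-element `b=[10]` yields `[11, 12, 13]`, with the lone `b` reused for every call.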
BlenderNeko
19c014f429 comment out annoying print statement 2023-05-12 23:57:40 +02:00
BlenderNeko
d9e088ddfd minor changes for tiled sampler 2023-05-12 23:49:09 +02:00
comfyanonymous
f7c0f75d1f Auto batching improvements.
Try batching when cond sizes don't match with smart padding.
2023-05-10 13:59:24 -04:00
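Smart padding lets conds of different lengths share one batch: each conditioning sequence is padded to the longest length before stacking. A toy sketch on flat lists standing in for token-embedding tensors (names and the zero pad value are illustrative):

```python
def pad_conds_for_batching(conds, pad_value=0.0):
    """Pad variable-length conditioning sequences (sketched as flat
    lists) to a common length so they can be stacked into one batch."""
    target = max(len(c) for c in conds)
    return [c + [pad_value] * (target - len(c)) for c in conds]
```

Once every cond has the same shape, positive and negative prompts of different token counts can run through the model in a single batched call.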
comfyanonymous
314e526c5c Not needed anymore because sampling works with any latent size. 2023-05-09 12:18:18 -04:00
comfyanonymous
c6e34963e4 Make t2i adapter work with any latent resolution. 2023-05-08 18:15:19 -04:00
comfyanonymous
a1f12e370d Merge branch 'autostart' of https://github.com/EllangoK/ComfyUI 2023-05-07 17:19:03 -04:00
comfyanonymous
6fc4917634 Make maximum_batch_area take into account pytorch 2.0 attention function.
More conservative xformers maximum_batch_area.
2023-05-06 19:58:54 -04:00
comfyanonymous
678f933d38 maximum_batch_area for xformers.
Remove useless code.
2023-05-06 19:28:46 -04:00
EllangoK
8e03c789a2 auto-launch cli arg 2023-05-06 16:59:40 -04:00
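An auto-launch flag is a store-true CLI argument that opens the UI in the default browser once the server is up. A sketch with the standard library (the flag name and URL here are illustrative defaults, not necessarily the repo's):

```python
import argparse
import webbrowser

parser = argparse.ArgumentParser()
parser.add_argument("--auto-launch", action="store_true",
                    help="open the UI in the default browser after the server starts")

def maybe_launch(args, url="http://127.0.0.1:8188"):
    """Open the browser only when the flag was passed."""
    if args.auto_launch:
        webbrowser.open(url)
        return True
    return False
```

Calling `maybe_launch` after the HTTP server binds its port avoids opening a browser tab onto a connection-refused page.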
comfyanonymous
cb1551b819 Lowvram mode for gligen and fix some lowvram issues. 2023-05-05 18:11:41 -04:00
comfyanonymous
af9cc1fb6a Search recursively in subfolders for embeddings. 2023-05-05 01:28:48 -04:00
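Recursive embedding search replaces a flat directory listing with a walk of the whole tree, filtering by extension. A sketch with `os.walk` (the function name and extension set are illustrative):

```python
import os

def find_embeddings(root, exts=(".pt", ".safetensors")):
    """Walk root recursively and collect embedding files by extension,
    returned as paths relative to root."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(exts):
                rel = os.path.relpath(os.path.join(dirpath, name), root)
                found.append(rel)
    return sorted(found)
```

Keeping the returned paths relative to the embeddings folder lets prompts reference a nested embedding by its subfolder path.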
comfyanonymous
6ee11d7bc0 Fix import. 2023-05-05 00:19:35 -04:00
comfyanonymous
bae4fb4a9d Fix imports. 2023-05-04 18:10:29 -04:00
comfyanonymous
fcf513e0b6 Refactor. 2023-05-03 17:48:35 -04:00
comfyanonymous
a74e176a24 Merge branch 'tiled-progress' of https://github.com/pythongosssss/ComfyUI 2023-05-03 16:24:56 -04:00