comfyanonymous
a9e0b4177d
Add callback to sampler function.
...
Callback format is: callback(step, x0, x)
2023-04-27 04:38:44 -04:00
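The commit above gives the hook signature, `callback(step, x0, x)`. A minimal stand-in sampler (hypothetical, not ComfyUI's real loop; the denoise math is fake) showing where such a hook would fire each step:

```python
# Hypothetical sketch of wiring a callback(step, x0, x) into a sampler loop.
# The arithmetic below is a stand-in, not an actual denoising step.

def sample(steps, latent, callback=None):
    x = list(latent)
    for step in range(steps):
        x = [v * 0.5 for v in x]    # stand-in for one denoising step
        x0 = [v * 0.1 for v in x]   # stand-in for the model's denoised prediction
        if callback is not None:
            callback(step, x0, x)   # per-step hook, e.g. for progress/preview
    return x

progress = []
sample(3, [1.0, 2.0], callback=lambda step, x0, x: progress.append(step))
```

A progress bar or live-preview node would be a typical consumer of such a hook.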
comfyanonymous
d95ef10342
Some fixes to the batch masks PR.
2023-04-25 01:12:40 -04:00
comfyanonymous
288c72fe9f
Refactor more code to sample.py
2023-04-24 23:25:51 -04:00
comfyanonymous
112d4ab7d7
This is cleaner this way.
2023-04-24 22:45:35 -04:00
BlenderNeko
d556513e2a
gligen tuple
2023-04-24 21:47:57 +02:00
pythongosssss
51a069805b
Add progress to vae decode tiled
2023-04-24 11:55:44 +01:00
BlenderNeko
7db702ecd0
made sample functions more explicit
2023-04-24 12:53:10 +02:00
BlenderNeko
ac6686d523
add docstrings
2023-04-23 20:09:09 +02:00
BlenderNeko
1e9a5097e1
Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI
2023-04-23 20:02:18 +02:00
BlenderNeko
3e6b963e46
refactor/split various bits of code for sampling
2023-04-23 20:02:08 +02:00
comfyanonymous
e6771d0986
Implement Linear hypernetworks.
...
Add a HypernetworkLoader node to use hypernetworks.
2023-04-23 12:35:25 -04:00
comfyanonymous
0d66023475
This makes pytorch 2.0 attention perform a bit faster.
2023-04-22 14:30:39 -04:00
comfyanonymous
ed4d73d4fd
Remove some useless code.
2023-04-20 23:58:25 -04:00
comfyanonymous
a86c06e8da
Don't pass adm to model when it doesn't support it.
2023-04-19 21:11:38 -04:00
comfyanonymous
6c156642e4
Add support for GLIGEN textbox model.
2023-04-19 11:06:32 -04:00
comfyanonymous
3fe1252a35
Add a way for nodes to set a custom CFG function.
2023-04-17 11:05:15 -04:00
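A custom CFG function would replace the standard classifier-free guidance combination, `uncond + cfg_scale * (cond - uncond)`. A pure-Python sketch of that default (plain lists stand in for tensors; not ComfyUI's actual code):

```python
# Default classifier-free guidance combination. A node-supplied custom CFG
# function would substitute its own logic for this. Illustrative only.

def default_cfg(cond, uncond, cfg_scale):
    return [u + cfg_scale * (c - u) for c, u in zip(cond, uncond)]

out = default_cfg([1.0, 2.0], [0.5, 1.0], 2.0)
# 0.5 + 2*(1.0 - 0.5) = 1.5 ; 1.0 + 2*(2.0 - 1.0) = 3.0
```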
comfyanonymous
4df70d0f62
Fix model_management import so it doesn't get executed twice.
2023-04-15 19:04:33 -04:00
comfyanonymous
f089d4abc7
Some refactoring: from_tokens -> encode_from_tokens
2023-04-15 18:46:58 -04:00
comfyanonymous
1b821e4d57
Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI
2023-04-15 14:16:50 -04:00
BlenderNeko
e550f2f84f
fixed improper padding
2023-04-15 19:38:21 +02:00
comfyanonymous
3b9a2f504d
Move code to empty gpu cache to model_management.py
2023-04-15 11:19:07 -04:00
comfyanonymous
0ecef1b4e8
Safely load pickled embeds that don't load with weights_only=True.
2023-04-14 15:33:43 -04:00
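One common technique for "safely" loading a pickle that fails under `weights_only=True` is a restricted `Unpickler` that only resolves an allow-list of globals. This is a generic sketch of that pattern, with a hypothetical allow-list, not ComfyUI's exact implementation:

```python
import io
import pickle
from collections import OrderedDict

# Hypothetical allow-list; a real loader would enumerate the tensor/storage
# classes it expects to see in a checkpoint.
ALLOWED = {("collections", "OrderedDict")}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Only resolve globals we explicitly trust; everything else is blocked.
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked global: {module}.{name}")

def safe_loads(data: bytes):
    return RestrictedUnpickler(io.BytesIO(data)).load()

blob = pickle.dumps(OrderedDict(a=1))
restored = safe_loads(blob)
```

Anything referencing a global outside the allow-list raises instead of executing arbitrary code.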
BlenderNeko
47b2d342a8
ensure backwards compat with optional args
2023-04-14 21:16:55 +02:00
BlenderNeko
779bed1a43
align behavior with old tokenize function
2023-04-14 21:02:45 +02:00
comfyanonymous
4d8a84520f
Don't stop workflow if loading embedding fails.
2023-04-14 13:54:00 -04:00
BlenderNeko
928cb6b2c7
split tokenizer from encoder
2023-04-13 22:06:50 +02:00
BlenderNeko
02f7bf6cb8
add unique ID per word/embedding for tokenizer
2023-04-13 22:01:01 +02:00
comfyanonymous
9c2869e012
Fix for new transformers version.
2023-04-09 15:55:21 -04:00
comfyanonymous
4861dbb2e2
Print xformers version and warning about 0.0.18
2023-04-09 01:31:47 -04:00
comfyanonymous
7f1c54adeb
Clarify what --windows-standalone-build does.
2023-04-07 15:52:56 -04:00
comfyanonymous
88e5ccb415
Cleanup.
2023-04-07 02:31:46 -04:00
comfyanonymous
d4301d49d3
Fix loading SD1.5 diffusers checkpoint.
2023-04-07 01:30:33 -04:00
comfyanonymous
8aebe865f2
Merge branch 'master' of https://github.com/sALTaccount/ComfyUI
2023-04-07 01:03:43 -04:00
comfyanonymous
d35efcbcb2
Add a --force-fp32 argument to force fp32 for debugging.
2023-04-07 00:27:54 -04:00
comfyanonymous
55a48f27db
Small refactor.
2023-04-06 23:53:54 -04:00
comfyanonymous
69bc45aac3
Merge branch 'ipex' of https://github.com/kwaa/ComfyUI-IPEX
2023-04-06 23:45:29 -04:00
藍+85CD
3adedd52d3
Merge branch 'master' into ipex
2023-04-07 09:11:30 +08:00
EllangoK
b12efcd7b4
fixes lack of support for multi configs
...
also adds some metavars to argparse
2023-04-06 19:06:39 -04:00
comfyanonymous
f2ca0a15ee
Rename the cors parameter to something more verbose.
2023-04-06 15:24:55 -04:00
EllangoK
405685ce89
makes cors a cli parameter
2023-04-06 15:06:22 -04:00
EllangoK
1992795f88
set listen flag to listen on all if specified
2023-04-06 13:19:00 -04:00
藍+85CD
01c0951a73
Fix auto lowvram detection on CUDA
2023-04-06 15:44:05 +08:00
sALTaccount
671feba9e6
diffusers loader
2023-04-05 23:57:31 -07:00
藍+85CD
23fd4abad1
Use separate variables instead of vram_state
2023-04-06 14:24:47 +08:00
藍+85CD
772741f218
Import intel_extension_for_pytorch as ipex
2023-04-06 12:27:22 +08:00
EllangoK
cd00b46465
separates out arg parser and imports args
2023-04-05 23:41:23 -04:00
藍+85CD
16f4d42aa0
Add basic XPU device support
...
closed #387
2023-04-05 21:22:14 +08:00
comfyanonymous
7597a5d83e
Disable xformers in VAE when xformers == 0.0.18
2023-04-04 22:22:02 -04:00
comfyanonymous
76a6b372da
Ignore embeddings when sizes don't match and print a WARNING.
2023-04-04 11:49:29 -04:00
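The guard described above (skip a mismatched embedding with a warning rather than crash) can be sketched like this; the function and dict layout are illustrative, not ComfyUI's actual names:

```python
import warnings

# Hypothetical sketch: drop embeddings whose vector size doesn't match the
# model's expected dimension, warning instead of raising.

def filter_embeddings(embeddings, expected_dim):
    kept = {}
    for name, vec in embeddings.items():
        if len(vec) != expected_dim:
            warnings.warn(
                f"WARNING: embedding {name} has size {len(vec)}, "
                f"expected {expected_dim}; ignoring"
            )
            continue
        kept[name] = vec
    return kept
```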
comfyanonymous
d96fde8103
Remove print.
2023-04-03 22:58:54 -04:00
comfyanonymous
0d1c6a3934
Pull latest tomesd code from upstream.
2023-04-03 15:49:28 -04:00
comfyanonymous
0e06be56ad
Add noise augmentation setting to unCLIPConditioning.
2023-04-03 13:50:29 -04:00
comfyanonymous
b55667284c
Add support for unCLIP SD2.x models.
...
See _for_testing/unclip in the UI for the new nodes.
unCLIPCheckpointLoader is used to load them.
unCLIPConditioning is used to add the image cond and takes as input a
CLIPVisionEncode output which has been moved to the conditioning section.
2023-04-01 23:19:15 -04:00
comfyanonymous
3567c01bc3
This seems to give better quality in tome.
2023-03-31 18:36:18 -04:00
comfyanonymous
7da8d5f9f5
Add a TomePatchModel node to the _for_testing section.
...
Tome increases sampling speed at the expense of quality.
2023-03-31 17:19:58 -04:00
comfyanonymous
bcdd11d687
Add a way to pass options to the transformers blocks.
2023-03-31 13:04:39 -04:00
comfyanonymous
4b174ce0b4
Fix noise mask not working with > 1 batch size on ksamplers.
2023-03-30 03:50:12 -04:00
comfyanonymous
7e2f4b0897
Split VAE decode batches depending on free memory.
2023-03-29 02:24:37 -04:00
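The idea behind splitting decode batches by free memory is simple arithmetic: estimate how many images fit in free memory at once, then divide the batch into that many sub-runs. A hedged sketch with made-up numbers (not ComfyUI's actual estimator):

```python
import math

# Illustrative sketch: choose how many sub-batches a VAE decode needs,
# given an estimated memory cost per image and the free memory available.

def decode_splits(batch_size, mem_per_image, free_memory):
    per_run = max(1, int(free_memory // mem_per_image))  # images that fit at once
    return math.ceil(batch_size / per_run)               # number of sub-batches

# e.g. 8 images at ~2 GB each with 5 GB free -> 2 per run -> 4 sub-batches
```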
comfyanonymous
7a5ba4593c
Fix ddim_uniform crashing with 37 steps.
2023-03-28 16:29:35 -04:00
Francesco Yoshi Gobbo
be201f0511
code cleanup
2023-03-27 06:48:09 +02:00
Francesco Yoshi Gobbo
28e1209e7e
no lowvram state if cpu only
2023-03-27 04:51:18 +02:00
comfyanonymous
184ea9fd54
Fix ddim for Mac: #264
2023-03-26 00:36:54 -04:00
comfyanonymous
089cd3adc1
I don't think controlnets were being handled correctly by MPS.
2023-03-24 14:33:16 -04:00
comfyanonymous
4f76d503c7
Merge branch 'master' of https://github.com/GaidamakUA/ComfyUI
2023-03-24 13:56:43 -04:00
Yurii Mazurevich
1206cb3386
Fixed typo
2023-03-24 19:39:55 +02:00
comfyanonymous
5d8c210a1d
Make ddim work with --cpu
2023-03-24 11:39:51 -04:00
Yurii Mazurevich
b9b8b1893d
Removed unnecessary comment
2023-03-24 14:15:30 +02:00
Yurii Mazurevich
f500d638c6
Added MPS device support
2023-03-24 14:12:56 +02:00
comfyanonymous
121beab586
Support loha models that use cp decomposition.
2023-03-23 04:32:25 -04:00
comfyanonymous
d15c9f2c3b
Add loha support.
2023-03-23 03:40:12 -04:00
comfyanonymous
7cae5f5769
Try again with vae tiled decoding if regular fails because of OOM.
2023-03-22 14:49:00 -04:00
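The retry-on-OOM behavior is a straightforward fallback pattern: attempt the regular decode, and if it raises an out-of-memory error, decode tiled instead. A sketch where `OutOfMemoryError` and the decode functions are stand-ins (the real code would catch the framework's OOM exception, e.g. torch's):

```python
# Stand-in for the framework's out-of-memory exception.
class OutOfMemoryError(RuntimeError):
    pass

def decode_with_fallback(decode, decode_tiled, latent):
    try:
        return decode(latent)            # regular, faster decode
    except OutOfMemoryError:
        return decode_tiled(latent)      # slower but memory-frugal fallback

def oom_decode(latent):
    raise OutOfMemoryError("simulated OOM")

result = decode_with_fallback(oom_decode, lambda s: f"tiled:{s}", "latent")
```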
comfyanonymous
d8d65e7c9f
Less seams in tiled outputs at the cost of more processing.
2023-03-22 03:29:09 -04:00
comfyanonymous
13ec5cd3c2
Try to improve VAEEncode memory usage a bit.
2023-03-22 02:45:18 -04:00
comfyanonymous
db20d0a9fe
Add laptop quadro cards to fp32 list.
2023-03-21 16:57:35 -04:00
comfyanonymous
921f4e14a4
Add support for locon mid weights.
2023-03-21 14:51:51 -04:00
comfyanonymous
29728e1119
Try to fix a vram issue with controlnets.
2023-03-19 10:50:38 -04:00
comfyanonymous
508010d114
Fix area composition feathering not working properly.
2023-03-19 02:00:52 -04:00
comfyanonymous
06c7a9b406
Support multiple paths for embeddings.
2023-03-18 03:08:43 -04:00
comfyanonymous
d5bf8038c8
Merge T2IAdapterLoader and ControlNetLoader.
...
Workflows will be auto updated.
2023-03-17 18:17:59 -04:00
comfyanonymous
dc8b43e512
Make --cpu have priority over everything else.
2023-03-13 21:30:01 -04:00
comfyanonymous
889689c7b2
use half() on fp16 models loaded with config.
2023-03-13 21:12:48 -04:00
comfyanonymous
edba761868
Use half() function on model when loading in fp16.
2023-03-13 20:58:09 -04:00
comfyanonymous
5be28c4069
Remove omegaconf dependency and some ci changes.
2023-03-13 14:49:18 -04:00
comfyanonymous
1de721b33c
Add pytorch attention support to VAE.
2023-03-13 12:45:54 -04:00
comfyanonymous
72b42ab260
--disable-xformers should not even try to import xformers.
2023-03-13 11:36:48 -04:00
comfyanonymous
7c95e1a03b
Xformers is now properly disabled when --cpu used.
...
Added --windows-standalone-build option; currently it only
makes the code open comfyui in the browser.
2023-03-12 15:44:16 -04:00
comfyanonymous
d777b2f4a9
Add a VAEEncodeTiled node.
2023-03-11 15:28:15 -05:00
comfyanonymous
bbdc5924b4
Try to fix memory issue.
2023-03-11 15:15:13 -05:00
comfyanonymous
adabcc74fb
Make tiled_scale work for downscaling.
2023-03-11 14:58:55 -05:00
comfyanonymous
77290f7110
Tiled upscaling with the upscale models.
2023-03-11 14:04:13 -05:00
comfyanonymous
66cca3659d
Add locon support.
2023-03-09 21:41:24 -05:00
comfyanonymous
fa4dda2550
SD2.x controlnets now work.
2023-03-08 01:13:38 -05:00
comfyanonymous
341ebaabd8
Relative imports to test something.
2023-03-07 11:00:35 -05:00
edikius
148fe7b116
Fixed import (#44)
...
* fixed import error
I had an
ImportError: cannot import name 'Protocol' from 'typing'
while trying to update, so I fixed it so the app starts
* Update main.py
* deleted example files
2023-03-06 11:41:40 -05:00
comfyanonymous
339b2533eb
Fix clip_skip no longer being loaded from yaml file.
2023-03-06 11:34:02 -05:00
comfyanonymous
ca7e2e3827
Add --cpu to use the cpu for inference.
2023-03-06 10:50:50 -05:00
comfyanonymous
663fa6eafd
Implement support for t2i style model.
...
It needs the CLIPVision model so I added CLIPVisionLoader and CLIPVisionEncode.
Put the clip vision model in models/clip_vision
Put the t2i style model in models/style_models
StyleModelLoader to load it, StyleModelApply to apply it
ConditioningAppend to append the conditioning it outputs to a positive one.
2023-03-05 18:39:25 -05:00
comfyanonymous
f8f2ea3bb1
Make VAE use common function to get free memory.
2023-03-05 14:20:07 -05:00
comfyanonymous
ffc5c6707b
Fix pytorch 2.0 cross attention not working.
2023-03-05 14:14:54 -05:00
comfyanonymous
7ee5e2d40e
Add support for new colour T2I adapter model.
2023-03-03 19:13:07 -05:00