Commit Graph

1417 Commits

Author SHA1 Message Date
guill
7d4530f6f5
Rework Caching (#2)
This commit solves a number of bugs and adds some caching related
functionality. Specifically:

1. Caching is now input-based. In cases of completely identical nodes,
   the output will be reused (for example, if you have multiple
   LoadCheckpoint nodes loading the same checkpoint). If a node doesn't
   want this behavior (e.g. a `RandomInteger` node), it should set
   `NOT_IDEMPOTENT = True`.
2. This means that nodes within a component will now be cached and will
   only change if the input actually changes. Note that types that can't
   be hashed by default will always count as changed (though the
   component itself will only expand if one of its inputs changes).
3. A new LRU caching strategy is now available by starting with
   `--cache-lru 100`. With this strategy, in addition to the latest
   workflow being cached, up to N (100 in the example) node outputs will
   be retained. This allows users to work on multiple workflows or
   experiment with different inputs without losing the benefits of
   caching (at the cost of more RAM and VRAM). I intentionally left some
   additional debug print statements in for this strategy for the
   moment.
4. A new endpoint `/debugcache` has been temporarily added to assist
   with tracking down issues people encounter. It allows you to browse
   the contents of the cache.
5. Outputs from ephemeral nodes will now be communicated to the
   front-end with the ephemeral node id, the 'parent' node id, and
   the 'display' node id. The front-end has been updated to handle
   this.
2023-09-11 19:53:41 -07:00
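The input-keyed caching and `--cache-lru` strategy described in this commit can be sketched as follows. This is a minimal illustration, not ComfyUI's actual implementation; the class name `NodeOutputCache` and its methods are hypothetical:

```python
from collections import OrderedDict

class NodeOutputCache:
    """Sketch of input-based LRU caching for node outputs.

    A cache key is derived from the node type and its inputs, so
    identical nodes (e.g. two LoadCheckpoint nodes loading the same
    checkpoint) share one cached output. Nodes flagged NOT_IDEMPOTENT
    are never cached. max_size mirrors the N in `--cache-lru N`.
    """

    def __init__(self, max_size=100):
        self.max_size = max_size
        self._cache = OrderedDict()  # key -> output, oldest first

    @staticmethod
    def make_key(node_type, inputs):
        try:
            key = (node_type, tuple(sorted(inputs.items())))
            hash(key)  # inputs that can't be hashed can't be cached...
            return key
        except TypeError:
            return None  # ...so they always count as changed

    def get(self, key):
        if key is None or key not in self._cache:
            return None
        self._cache.move_to_end(key)  # mark as most recently used
        return self._cache[key]

    def put(self, key, output, not_idempotent=False):
        if key is None or not_idempotent:
            return  # e.g. a RandomInteger node must re-run every time
        self._cache[key] = output
        self._cache.move_to_end(key)
        if len(self._cache) > self.max_size:
            self._cache.popitem(last=False)  # evict least recently used
```

Two identical nodes produce the same key and therefore hit the same cached output, while an unhashable input (say, a raw tensor without a hash) yields no key and is recomputed every time.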
Jacob Segal
f15bd84351 Merge remote-tracking branch 'upstream/master' into node_expansion 2023-09-02 21:08:31 -07:00
comfyanonymous
a74c5dbf37 Move some functions to utils.py 2023-09-02 22:33:37 -04:00
comfyanonymous
766c7b3815 Update upscale model code to latest Chainner model code.
Don't add SRFormer because the code license is incompatible with the GPL.

Remove MAT because it's unused and the license is incompatible with GPL.
2023-09-02 22:27:40 -04:00
comfyanonymous
62efc78a4b Display history in reverse order to make it easier to load last gen. 2023-09-02 15:49:16 -04:00
comfyanonymous
6962cb46a9 Fix issue when node_input is undefined. 2023-09-02 12:17:30 -04:00
comfyanonymous
7291e303f6 Fix issue with some workflows not getting serialized. 2023-09-02 11:48:44 -04:00
comfyanonymous
77a176f9e0 Use a common function to reshape the batch. 2023-09-02 03:42:49 -04:00
comfyanonymous
36ea8784a8 Only return tuple of 3 args in CheckpointLoaderSimple. 2023-09-02 03:34:57 -04:00
Muhammed Yusuf
7891d13329
Added label for autoQueueCheckbox. (#1295)
* Added label for autoQueueCheckbox.

* Fixed menu getting behind some custom nodes.

* Edited extraOptions.
Options divided into separate divs to manage them with ease.
2023-09-02 02:58:23 -04:00
comfyanonymous
7931ff0fd9 Support SDXL inpaint models. 2023-09-01 15:22:52 -04:00
comfyanonymous
c335fdf200 Merge branch 'pixelass-patch-1' of https://github.com/pixelass/ComfyUI 2023-09-01 11:48:11 -04:00
comfyanonymous
43f2505389 Merge branch 'fix/widget-wonkyness' of https://github.com/M1kep/ComfyUI 2023-09-01 03:07:10 -04:00
comfyanonymous
0e3b641172 Remove xformers related print. 2023-09-01 02:12:03 -04:00
comfyanonymous
5c363a9d86 Fix controlnet bug. 2023-09-01 02:01:08 -04:00
Michael Poutre
69c5e6de85
fix(widgets): Add options object if not present when forceInput: true 2023-08-31 17:58:43 -07:00
Michael Poutre
9a7a52f8b5
refactor/fix: Treat forceInput widgets as standard widgets 2023-08-31 17:58:43 -07:00
comfyanonymous
cfe1c54de8 Fix controlnet issue. 2023-08-31 15:16:58 -04:00
comfyanonymous
57beace324 Fix VAEDecodeTiled minimum. 2023-08-31 14:26:16 -04:00
comfyanonymous
1c012d69af It doesn't make sense for c_crossattn and c_concat to be lists. 2023-08-31 13:25:00 -04:00
comfyanonymous
5f101f4da1 Update litegraph with upstream: middle mouse dragging. 2023-08-31 02:39:34 -04:00
Ridan Vandenbergh
2cd3980199 Remove forced lowercase on embeddings endpoint 2023-08-30 20:48:55 +02:00
comfyanonymous
7e941f9f24 Clean up DiffusersLoader node. 2023-08-30 12:57:07 -04:00
Simon Lui
18617967e5
Fix error message in model_patcher.py
Found while tinkering.
2023-08-30 00:25:04 -07:00
comfyanonymous
fe4c07400c Fix "Load Checkpoint with config" node. 2023-08-29 23:58:32 -04:00
comfyanonymous
d70b0bc43c Use the GPU for the canny preprocessor when available. 2023-08-29 17:58:40 -04:00
comfyanonymous
81d9200e18 Add node to convert a specific colour in an image to a mask. 2023-08-29 17:55:42 -04:00
comfyanonymous
f2f5e5dcbb Support SDXL t2i adapters with 3 channel input. 2023-08-29 16:44:57 -04:00
comfyanonymous
15adc3699f Move beta_schedule to model_config and allow disabling unet creation. 2023-08-29 14:22:53 -04:00
comfyanonymous
968078b149 Merge branch 'feat/mute_bypass_nodes_in_group' of https://github.com/M1kep/ComfyUI 2023-08-29 11:33:40 -04:00
comfyanonymous
66c690e698 Merge branch 'preserve-pnginfo' of https://github.com/chrisgoringe/ComfyUI 2023-08-29 11:32:58 -04:00
comfyanonymous
bed116a1f9 Remove optimization that caused border. 2023-08-29 11:21:36 -04:00
Chris
18379dea36 check for text attr and save 2023-08-29 18:50:28 +10:00
Chris
edcff9ab8a copy metadata into modified image 2023-08-29 18:50:28 +10:00
Michael Poutre
6944288aff
refactor(ui): Switch statement, and handle other modes in group actions 2023-08-29 00:24:31 -07:00
Michael Poutre
e30d546e38
feat(ui): Add node mode toggles to group context menu 2023-08-28 23:49:25 -07:00
comfyanonymous
8ddd081b09 Use the same units for tile size in VAEDecodeTiled and VAEEncodeTiled. 2023-08-29 01:51:35 -04:00
comfyanonymous
fbf375f161 Merge branch 'master' of https://github.com/bvhari/ComfyUI 2023-08-29 01:42:00 -04:00
comfyanonymous
65cae62c71 No need to check filename extensions to detect shuffle controlnet. 2023-08-28 16:49:06 -04:00
comfyanonymous
4e89b2c25a Put clip vision outputs on the CPU. 2023-08-28 16:26:11 -04:00
comfyanonymous
a094b45c93 Load clipvision model to GPU for faster performance. 2023-08-28 15:29:27 -04:00
comfyanonymous
1300a1bb4c Text encoder should initially load on the offload_device, not the regular device. 2023-08-28 15:08:45 -04:00
comfyanonymous
f92074b84f Move ModelPatcher to model_patcher.py 2023-08-28 14:51:31 -04:00
BVH
d86b222fe9
Reduce min tile size for encode 2023-08-28 22:39:09 +05:30
comfyanonymous
4798cf5a62 Implement loras with norm keys. 2023-08-28 11:20:06 -04:00
BVH
9196588088
Make tile size in Tiled VAE encode/decode user configurable 2023-08-28 19:57:22 +05:30
Dr.Lt.Data
0faee1186f
support on prompt event handler (#765)
Co-authored-by: Lt.Dr.Data <lt.dr.data@gmail.com>
2023-08-28 00:52:22 -04:00
comfyanonymous
b8c7c770d3 Enable bf16-vae by default on ampere and up. 2023-08-27 23:06:19 -04:00
comfyanonymous
1c794a2161 Fall back to sliced attention if xformers doesn't support the operation. 2023-08-27 22:24:42 -04:00
comfyanonymous
d935ba50c4 Make --bf16-vae work on torch 2.0 2023-08-27 21:33:53 -04:00