This commit solves a number of bugs and adds some caching related
functionality. Specifically:
1. Caching is now input-based. When nodes are completely identical, the
output will be reused (for example, if you have multiple
LoadCheckpoint nodes loading the same checkpoint). If a node doesn't
want this behavior (e.g. a `RandomInteger` node), it should set
`NOT_IDEMPOTENT = True`.
2. This means that nodes within a component will now be cached and will
only change if the input actually changes. Note that types that can't
be hashed by default will always count as changed (though the
component itself will only expand if one of its inputs changes).
3. A new LRU caching strategy is now available by launching ComfyUI with
`--cache-lru 100`. With this strategy, in addition to the latest
workflow being cached, up to N (100 in the example) node outputs will
be retained. This allows users to work on multiple workflows or
experiment with different inputs without losing the benefits of
caching (at the cost of more RAM and VRAM). I intentionally left some
additional debug print statements in for this strategy for the
moment.
4. A new endpoint `/debugcache` has been temporarily added to assist
with tracking down issues people encounter. It allows you to browse
the contents of the cache.
5. Outputs from ephemeral nodes will now be communicated to the
front-end with the ephemeral node id, the 'parent' node id, and
the 'display' node id. The front-end has been updated to deal with
this.
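The opt-out behavior in item 1 can be sketched as follows. This is a minimal illustration assuming a hypothetical executor and node classes; only the `NOT_IDEMPOTENT` flag name comes from this commit, not ComfyUI's actual node API:

```python
import random

class LoadCheckpoint:
    """Deterministic node: the same input always yields the same output."""
    def run(self, ckpt_name):
        return f"model-from-{ckpt_name}"

class RandomInteger:
    """Non-deterministic node: opts out of input-based caching."""
    NOT_IDEMPOTENT = True

    def run(self, seed_max):
        return random.randint(0, seed_max)

_cache = {}

def execute(node, **inputs):
    # Always rerun nodes that declare themselves non-idempotent.
    if getattr(node, "NOT_IDEMPOTENT", False):
        return node.run(**inputs)
    # Cache key is the node class plus its (hashable) inputs; unhashable
    # inputs would have to count as "always changed", as noted in item 2.
    key = (type(node).__name__, tuple(sorted(inputs.items())))
    if key not in _cache:
        _cache[key] = node.run(**inputs)
    return _cache[key]
```

With this, two identical `LoadCheckpoint` nodes share one cached output, while every `RandomInteger` execution produces a fresh value.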
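The LRU strategy in item 3 can be sketched with the standard library's `OrderedDict`; the class and method names below are assumptions, since the commit does not show the actual implementation:

```python
from collections import OrderedDict

class LRUNodeCache:
    """Retain up to max_entries node outputs, evicting least-recently-used."""

    def __init__(self, max_entries):
        self.max_entries = max_entries
        self._entries = OrderedDict()

    def get(self, key):
        if key not in self._entries:
            return None
        self._entries.move_to_end(key)  # mark as most recently used
        return self._entries[key]

    def put(self, key, value):
        self._entries[key] = value
        self._entries.move_to_end(key)
        while len(self._entries) > self.max_entries:
            self._entries.popitem(last=False)  # drop the oldest entry
```

`--cache-lru 100` would then correspond to `LRUNodeCache(100)`: switching between workflows only starts evicting outputs once more than 100 are held, at the cost of keeping those outputs in RAM and VRAM.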
* Added label for autoQueueCheckbox.
* Fixed the menu getting hidden behind some custom nodes.
* Edited extraOptions: the options are now divided into separate divs so
they are easier to manage, similar to the fixes in the litegraph.js
editor demo:
3ef215cf11/editor/js/code.js (L19-L28)
Also added workarounds to address the viewport problem of litegraph.js in DPI scaling scenarios.
Fixes #161
This enables local reverse proxies to host ComfyUI on a subpath, e.g. "http://example.com/ComfyUI/", in such a way that at least everything I tested works. Without this patch, proxying ComfyUI this way yields errors.
The created checkpoints contain workflow metadata that can be loaded by
dragging them on top of the UI or loading them with the "Load" button.
Checkpoints will be saved in fp16 or fp32 depending on the format ComfyUI
is using for inference on your hardware. To force fp32 use: --force-fp32
Anything that patches the model weights like merging or loras will be
saved.
The output directory is currently set to output/checkpoints, but that might
change in the future.
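The precision rule above amounts to a one-line decision; the helper name and string dtype labels here are illustrative, not actual ComfyUI code:

```python
def checkpoint_save_dtype(inference_dtype, force_fp32=False):
    """Checkpoints are written in the dtype used for inference,
    unless --force-fp32 overrides it."""
    if force_fp32:
        return "fp32"
    return "fp16" if inference_dtype == "fp16" else "fp32"
```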
* Support preview mode for the mask editor.
* Use the original file reference instead of the loaded frontend blob.
bugfix:
* Prevent the file open dialog from opening when saving back to the loaded image.
* Fix being unable to clear the alpha of a previously mask-painted image.
---------
Co-authored-by: Lt.Dr.Data <lt.dr.data@gmail.com>