Commit Graph

392 Commits

Author SHA1 Message Date
Jacob Segal
cb57c901cf Merge remote-tracking branch 'upstream/master' into node_expansion 2023-08-12 13:55:03 -07:00
comfyanonymous
c8a23ce9e8 Support for yet another lora type based on diffusers. 2023-08-11 13:04:21 -04:00
comfyanonymous
2bc12d3d22 Add --temp-directory argument to set temp directory. 2023-08-11 05:13:03 -04:00
comfyanonymous
c20583286f Support diffuser text encoder loras. 2023-08-10 20:28:28 -04:00
comfyanonymous
cf10c5592c Disable calculating uncond when CFG is 1.0 2023-08-09 20:55:03 -04:00
Jacob Segal
a86d383ff3 Code cleanup
While implementing caching for components, I did some cleanup. Although
subgraph caching is on hold for now (in favor of a larger cache refactor
later), these are the changes I think are worth keeping anyway.

* Makes subgraph node IDs deterministic
* Allows usage of the topological sort without execution
* Tracks parent nodes (i.e. those that caused a node to be created) and
  display nodes (i.e. the one we want to highlight while an ephemeral
  node is executing) separately.
2023-08-09 00:54:42 -07:00
comfyanonymous
1f0f4cc0bd Add argument to disable auto launching the browser. 2023-08-07 02:25:12 -04:00
comfyanonymous
d8e58f0a7e Detect hint_channels from controlnet. 2023-08-06 14:08:59 -04:00
comfyanonymous
c5d7593ccf Support loras in diffusers format. 2023-08-05 01:40:24 -04:00
comfyanonymous
1ce0d8ad68 Add CMP 30HX card to the nvidia_16_series list. 2023-08-04 12:08:45 -04:00
comfyanonymous
c99d8002f8 Make sure the pooled output stays at the EOS token with added embeddings. 2023-08-03 20:27:50 -04:00
comfyanonymous
4a77fcd6ab Only shift text encoder to vram when CPU cores are under 8. 2023-07-31 00:08:54 -04:00
comfyanonymous
3cd31d0e24 Lower CPU thread check for running the text encoder on the CPU vs GPU. 2023-07-30 17:18:24 -04:00
comfyanonymous
2b13939044 Remove some useless code. 2023-07-30 14:13:33 -04:00
comfyanonymous
95d796fc85 Faster VAE loading. 2023-07-29 16:28:30 -04:00
comfyanonymous
4b957a0010 Initialize the unet directly on the target device. 2023-07-29 14:51:56 -04:00
Jacob Segal
95c8e22fae Merge remote-tracking branch 'upstream/master' into node_expansion 2023-07-29 00:08:44 -07:00
Jacob Segal
5d72965863 Implement conditional Execution Blocking
Execution blocking can be done by returning an `ExecutionBlocker`
(imported from graph_utils) either in place of results or as a specific
output. Any node that uses an `ExecutionBlocker` as input will be
skipped. This operates on a per-entry basis when inputs are lists.

If the `ExecutionBlocker` is initialized with an error message, that
message will be displayed on the first node it's used on (and further
downstream nodes will be silently skipped).
2023-07-28 22:28:18 -07:00
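
A minimal sketch of the pattern described in the commit above, written in the usual ComfyUI custom-node style. The node class, its inputs, and the input/return declarations are illustrative assumptions; the commit only states that `ExecutionBlocker` comes from graph_utils and can be returned in place of a result or as a specific output.

```python
# Hypothetical gate node illustrating conditional execution blocking.
# Class name, inputs, and registration details are assumptions for illustration.
from graph_utils import ExecutionBlocker  # import path as stated in the commit


class GateImage:
    """Passes the image through, or blocks nodes that consume this output when disabled."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "image": ("IMAGE",),
            "enabled": ("BOOLEAN", {"default": True}),
        }}

    RETURN_TYPES = ("IMAGE",)
    FUNCTION = "gate"

    def gate(self, image, enabled):
        if enabled:
            return (image,)
        # Returning an ExecutionBlocker instead of the normal output causes any
        # node that consumes this output to be skipped. The optional message is
        # shown on the first node that would have used it; nodes further
        # downstream are skipped silently.
        return (ExecutionBlocker("Image gate is disabled"),)
```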
comfyanonymous
c910b4a01c Remove unused code and torchdiffeq dependency. 2023-07-28 21:32:27 -04:00
comfyanonymous
1141029a4a Add --disable-metadata argument to disable saving metadata in files. 2023-07-28 12:31:41 -04:00
comfyanonymous
fbf5c51c1c Merge branch 'fix_batch_timesteps' of https://github.com/asagi4/ComfyUI 2023-07-27 16:13:48 -04:00
comfyanonymous
68be24eead Remove some prints. 2023-07-27 16:12:43 -04:00
asagi4
1ea4d84691 Fix timestep ranges when batch_size > 1 2023-07-27 21:14:09 +03:00
comfyanonymous
5379051d16 Fix diffusers VAE loading. 2023-07-26 18:26:39 -04:00
comfyanonymous
727588d076 Fix some new loras. 2023-07-25 16:39:15 -04:00
comfyanonymous
4f9b6f39d1 Fix potential issue with Save Checkpoint. 2023-07-25 00:45:20 -04:00
comfyanonymous
5f75d784a1 Start is now 0.0 and end is now 1.0 for the timestep ranges. 2023-07-24 18:38:17 -04:00
comfyanonymous
7ff14b62f8 ControlNetApplyAdvanced can now define when controlnet gets applied. 2023-07-24 17:50:49 -04:00
comfyanonymous
d191c4f9ed Add a ControlNetApplyAdvanced node.
The controlnet can be applied to only the positive or only the negative
prompt by connecting it accordingly.
2023-07-24 13:35:20 -04:00
comfyanonymous
0240946ecf Add a way to set which range of timesteps the cond gets applied to. 2023-07-24 09:25:02 -04:00
Jacob Segal
b09620f89c In For loops, display feedback on original node
Rather than appearing on the "End For" node, live feedback is now
displayed in the UI on the original node.
2023-07-22 23:02:10 -07:00
Jacob Segal
b66253b930 Improve recognition of node linkage
Honestly, I'm still a little concerned here. There's nothing stopping a
custom node from having a data type of ["str",int]. I've improved
recognition to at least prevent the detection of other types, but we
may still want a more systemic fix (e.g. wrapping literals within a
class when using them as inputs to nodes in subgraphs).
2023-07-22 22:38:17 -07:00
comfyanonymous
22f29d66ca Try to fix memory issue with lora. 2023-07-22 21:38:56 -04:00
comfyanonymous
67be7eb81d Nodes can now patch the unet function. 2023-07-22 17:01:12 -04:00
comfyanonymous
12a6e93171 Del the right object when applying lora. 2023-07-22 11:25:49 -04:00
comfyanonymous
78e7958d17 Support controlnet in diffusers format. 2023-07-21 22:58:16 -04:00
comfyanonymous
09386a3697 Fix issue with lora in some cases when combined with model merging. 2023-07-21 21:27:27 -04:00
comfyanonymous
58b2364f58 Properly support SDXL diffusers unet with UNETLoader node. 2023-07-21 14:38:56 -04:00
comfyanonymous
0115018695 Print errors and continue when lora weights are not compatible. 2023-07-20 19:56:22 -04:00
comfyanonymous
4760c29380 Merge branch 'fix-AttributeError-module-'torch'-has-no-attribute-'mps'' of https://github.com/KarryCharon/ComfyUI 2023-07-20 00:34:54 -04:00
comfyanonymous
0b284f650b Fix typo. 2023-07-19 10:20:32 -04:00
comfyanonymous
e032ca6138 Fix ddim issue with older torch versions. 2023-07-19 10:16:00 -04:00
comfyanonymous
18885f803a Add MX450 and MX550 to list of cards with broken fp16. 2023-07-19 03:08:30 -04:00
Jacob Segal
b234baee2c Add lazy evaluation and dynamic node expansion
This PR inverts the execution model -- from recursively calling nodes to
using a topological sort of the nodes. This change allows for
modification of the node graph during execution. This allows for two
major advantages:
1. The implementation of lazy evaluation in nodes. For example, if a
   "Mix Images" node has a mix factor of exactly 0.0, the second image
   input doesn't even need to be evaluated (and vice versa if the mix
   factor is 1.0).
2. Dynamic expansion of nodes. This allows for the creation of dynamic
   "node groups". Specifically, custom nodes can return subgraphs that
   replace the original node in the graph. This is an *incredibly*
   powerful concept. Using this functionality, it was easy to
   implement:
   a. Components (a.k.a. node groups)
   b. Flow control (i.e. while loops) via tail recursion
   c. All-in-one nodes that replicate the WebUI functionality
   d. and more
All of these could be implemented entirely via custom nodes
without hooking or replacing any core functionality. Within this PR,
I've included all of these proof-of-concepts within a custom node pack.
In reality, I would expect some number of them to be merged into the
core node set (with the rest left to be implemented by custom nodes).

I made very few changes to the front-end, so there are probably some
easy UX wins for someone who is more willing to wade into .js land. The
user experience is a lot better than I expected though -- progress shows
correctly in the UI over the nodes that are being expanded.
2023-07-18 20:08:12 -07:00
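
To make the lazy-evaluation example from the commit message concrete, here is a rough sketch of a "Mix Images" style node. The `lazy` input option and the `check_lazy_status` hook are assumptions about how a node might declare which inputs it actually needs; they illustrate the concept only and are not presented as the exact API introduced by this PR.

```python
# Hypothetical "Mix Images" node illustrating lazy evaluation: when the mix
# factor is exactly 0.0 or 1.0, only one of the two image inputs is needed.
# The "lazy" option and check_lazy_status hook are assumptions for illustration.
class MixImages:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "factor": ("FLOAT", {"default": 0.5, "min": 0.0, "max": 1.0}),
            "image_a": ("IMAGE", {"lazy": True}),
            "image_b": ("IMAGE", {"lazy": True}),
        }}

    RETURN_TYPES = ("IMAGE",)
    FUNCTION = "mix"

    def check_lazy_status(self, factor, image_a=None, image_b=None):
        # Report which lazy inputs still need to be evaluated for this run.
        needed = []
        if factor < 1.0 and image_a is None:
            needed.append("image_a")
        if factor > 0.0 and image_b is None:
            needed.append("image_b")
        return needed

    def mix(self, factor, image_a=None, image_b=None):
        if factor == 0.0:
            return (image_a,)
        if factor == 1.0:
            return (image_b,)
        # Both inputs were requested, so a normal blend is computed.
        return (image_a * (1.0 - factor) + image_b * factor,)
```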
comfyanonymous
9ba440995a It's actually possible to torch.compile the unet now. 2023-07-18 21:36:35 -04:00
comfyanonymous
51d5477579 Add key to indicate checkpoint is v_prediction when saving. 2023-07-18 00:25:53 -04:00
comfyanonymous
ff6b047a74 Fix device print on old torch version. 2023-07-17 15:18:58 -04:00
comfyanonymous
9871a15cf9 Enable --cuda-malloc by default on torch 2.0 and up.
Add --disable-cuda-malloc to disable it.
2023-07-17 15:12:10 -04:00
comfyanonymous
55d0fca9fa --windows-standalone-build now enables --cuda-malloc 2023-07-17 14:10:36 -04:00
comfyanonymous
1679abd86d Add a command line argument to enable backend:cudaMallocAsync 2023-07-17 11:00:14 -04:00