ComfyUI/comfy_execution
Jacob Segal f1dc13037e Support for async execution functions
This commit adds support for node execution functions defined as `async`. When
a node's execution function is async, other nodes can continue
executing while it is processing.

Standard uses of `await` should "just work", but people will still have
to be careful if they spawn actual threads. Because torch doesn't
provide async/await versions of its functions, this won't help much
with most locally-executing nodes, but it does work for, e.g., web
requests to other machines.

In addition to the execute function, the `VALIDATE_INPUTS` and
`check_lazy_status` functions can also be defined as async, though for
those we currently resolve only one node at a time.
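The same pattern applied to validation might look like the sketch below. The node name and the remote-check body are hypothetical; the `VALIDATE_INPUTS` convention of returning `True` or an error string follows existing ComfyUI usage, and only the `async def` form is what this commit enables.

```python
import asyncio

class RemoteCheckNode:
    # Per the commit, VALIDATE_INPUTS may also be async, though the
    # executor resolves these one node at a time for now. The body is
    # a hypothetical stand-in for a remote availability check.
    @classmethod
    async def VALIDATE_INPUTS(cls, url):
        await asyncio.sleep(0.01)  # stand-in for a network round trip
        if not url.startswith("http"):
            return f"invalid url: {url}"
        return True

ok = asyncio.run(RemoteCheckNode.VALIDATE_INPUTS("http://example.com"))
bad = asyncio.run(RemoteCheckNode.VALIDATE_INPUTS("ftp://example.com"))
```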
2025-06-13 21:39:26 -07:00
caching.py Support for async execution functions 2025-06-13 21:39:26 -07:00
graph_utils.py Move a few files from comfy -> comfy_execution. 2024-08-15 11:21:14 -04:00
graph.py Support for async execution functions 2025-06-13 21:39:26 -07:00
progress.py Support for async execution functions 2025-06-13 21:39:26 -07:00
utils.py Support for async execution functions 2025-06-13 21:39:26 -07:00
validation.py Reland union type (#5900) 2024-12-04 15:12:10 -05:00