ComfyUI/tests/inference/testing_nodes/testing-pack
Jacob Segal 46c8311d14 Support for async execution functions
This commit adds support for node execution functions defined as async. While
an async node's execution function is awaiting, other nodes can
continue executing.

Standard uses of `await` should "just work", but people will still have
to be careful if they spawn actual threads. Because torch doesn't really
have async/await versions of functions, this won't particularly help
with most locally-executing nodes, but it does work for e.g. web
requests to other machines.

In addition to the execute function, the `VALIDATE_INPUTS` and
`check_lazy_status` functions can also be defined as async, though we'll
only resolve one node at a time right now for those.
2025-07-01 14:41:52 -07:00
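A hypothetical node sketching the async style this commit enables (the class name, inputs, and validation logic here are illustrative, not from the actual testing pack). The execution function and `VALIDATE_INPUTS` are declared `async def`, so the executor can await them; `asyncio.sleep` stands in for real async I/O such as a web request:

```python
import asyncio

# Hypothetical node (not from the testing pack); names are illustrative.
class AsyncFetchNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"value": ("INT", {"default": 0}),
                             "delay": ("FLOAT", {"default": 0.05})}}

    RETURN_TYPES = ("INT",)
    FUNCTION = "execute"
    CATEGORY = "testing"

    # VALIDATE_INPUTS may also be async per the commit message, though
    # such functions are resolved one node at a time for now.
    @classmethod
    async def VALIDATE_INPUTS(cls, value, delay):
        await asyncio.sleep(0)  # placeholder for async validation I/O
        return True if delay >= 0 else "delay must be non-negative"

    # Declared `async def`: the executor can run other nodes while this
    # coroutine is suspended (e.g. awaiting a web request).
    async def execute(self, value, delay):
        await asyncio.sleep(delay)  # stands in for real async I/O
        return (value + 1,)

# Standalone demonstration outside ComfyUI: two "nodes" overlap in time.
async def main():
    node = AsyncFetchNode()
    return await asyncio.gather(node.execute(1, 0.05),
                                node.execute(2, 0.05))

print(asyncio.run(main()))  # [(2,), (3,)]
```

Because both coroutines sleep concurrently under `asyncio.gather`, the total wall time is roughly one delay rather than two, which is the benefit the commit describes for nodes waiting on remote machines.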
File               Last commit                                                 Date
__init__.py        Execution Model Inversion (#2666)                           2024-08-15 11:21:11 -04:00
conditions.py      Execution Model Inversion (#2666)                           2024-08-15 11:21:11 -04:00
flow_control.py    Move a few files from comfy -> comfy_execution.             2024-08-15 11:21:14 -04:00
specific_tests.py  Support for async execution functions                       2025-07-01 14:41:52 -07:00
stubs.py           Fix a bug where cached outputs affected IS_CHANGED (#4535)  2024-08-21 23:38:46 -04:00
tools.py           Execution Model Inversion (#2666)                           2024-08-15 11:21:11 -04:00
util.py            Add ruff rule for empty line with trailing whitespace.      2024-12-28 05:23:08 -05:00