* Let's toggle this setting first.
* Also makes debugging easier. Honestly, this is the behavior I generally prefer anyway, though I'm no power user, shrug.
* Attempt a trick to move the `filter: brightness` work onto the GPU first, falling back to not using the filter at all for large lists!
* Revert litegraph.core.js changes from this branch
* oops
- Binary previews are not yet supported
- Use `--distributed-queue-connection-uri=amqp://guest:guest@rabbitmqserver/`
- Supported roles: frontend, worker, or both (see `--help`)
- Run `comfy-worker` for a lightweight worker that is easy to wrap your
  head around
- Workers and frontends must share the same directory structure (set
  with `--cwd`) and the same set of supported nodes. Frontends must
  still have access to inputs and outputs.
- Configuration notes:
  - `distributed_queue_connection_uri` (`Optional[str]`): servers and clients connect to this AMQP URL to form a distributed queue and exchange prompt execution requests and progress updates.
  - `distributed_queue_roles` (`List[str]`): one or more roles for the distributed queue. Acceptable values are `"worker"` and `"frontend"`; pass the flag twice to enable both. Frontends start the web UI and connect to the AMQP URL to submit prompts; workers pull requests off the queue.
  - `distributed_queue_name` (`str`): the name frontends and workers use to exchange prompt requests and replies. Progress updates are prefixed by the queue name, followed by a `.`, then the user ID.
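A rough sketch of how the pieces might be launched against a shared broker. The `--distributed-queue-roles` spelling is inferred from the configuration key above, and the `comfyui` entry point is an assumption, so confirm the exact flag and command names with `--help`:

```shell
# Assumes a RabbitMQ broker reachable at rabbitmqserver (names illustrative).

# Frontend: serves the web UI and submits prompts to the distributed queue.
comfyui --distributed-queue-connection-uri=amqp://guest:guest@rabbitmqserver/ \
        --distributed-queue-roles=frontend \
        --cwd=/srv/comfyui

# Worker: pulls prompt requests off the same queue; note the matching --cwd,
# since workers and frontends must share the same directory structure.
comfy-worker --distributed-queue-connection-uri=amqp://guest:guest@rabbitmqserver/ \
             --cwd=/srv/comfyui
```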
- Run ComfyUI workflows directly inside other Python applications using
  `EmbeddedComfyClient`.
- Optional, anonymity-preserving telemetry for prompts and models via
  Plausible (self-hosted or hosted).
- Better OpenAPI schema
- Basic support for distributed ComfyUI backends. Limitations: no
  progress reporting, no easy way to start your own distributed
  backend, and RabbitMQ is required as the message broker.
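The progress-update naming described in the configuration notes (queue name, then a `.`, then the user ID) can be illustrated with a small helper; the function name here is ours for illustration, not part of the API:

```python
def progress_topic(queue_name: str, user_id: str) -> str:
    """Build the routing key for a user's progress updates:
    the distributed queue name, a '.', then the user ID."""
    return f"{queue_name}.{user_id}"

# With an illustrative queue name "comfyui" and user "user-123":
print(progress_topic("comfyui", "user-123"))  # comfyui.user-123
```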