Commit Graph

30 Commits

comfyanonymous
6169965734 Fix parsing error crash. 2023-05-22 20:51:30 -04:00
comfyanonymous
5ee1ea3988 Add a node_errors field to the /prompt error json response.
"node_errors" contains a dict keyed by node ids. The contents are a message
and a list of dependent outputs.
2023-05-22 13:22:38 -04:00
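The shape described by that commit can be sketched as follows. The `node_errors` key, the per-node message, and the list of dependent outputs come from the commit text; everything else here (helper name, exact field names) is an assumption for illustration:

```python
import json

def build_node_errors(failures):
    """Map each failing node id to its message and the outputs that
    depend on it (a sketch of the layout, not ComfyUI's actual code)."""
    return {
        node_id: {"message": message, "dependent_outputs": dependent_outputs}
        for node_id, message, dependent_outputs in failures
    }

errors = build_node_errors([("4", "value bigger than max", ["9", "12"])])
body = json.dumps({"node_errors": errors})
```

A client can then highlight node "4" and grey out outputs "9" and "12" from one response.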
comfyanonymous
ea32c601ea Print min and max values in validation error message. 2023-05-21 00:24:28 -04:00
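The bounds reporting can be sketched as a range check that embeds both limits in the message; the exact wording and function name are assumptions, only the "print min and max" behaviour comes from the commit:

```python
def check_range(name, value, lo, hi):
    """Return an error string naming the violated bound, or None if ok."""
    if lo is not None and value < lo:
        return f"{name}: value {value} smaller than min of {lo}"
    if hi is not None and value > hi:
        return f"{name}: value {value} bigger than max of {hi}"
    return None

msg = check_range("steps", 200, 1, 100)
# msg == "steps: value 200 bigger than max of 100"
```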
comfyanonymous
5ab1c07017 Fix outputs gone from history. 2023-05-15 00:27:28 -04:00
comfyanonymous
08bea5f9a7 Print prompt execution time. 2023-05-14 01:34:25 -04:00
comfyanonymous
773d00ab0a Add the prompt id to some websocket messages. 2023-05-13 11:17:16 -04:00
BlenderNeko
dd1be4e992 Make nodes map over input lists (#579)
* allow nodes to map over lists

* make work with IS_CHANGED and VALIDATE_INPUTS

* give list outputs distinct socket shape

* add rebatch node

* add batch index logic

* add repeat latent batch

* deal with noise mask edge cases in latentfrombatch
2023-05-13 11:15:45 -04:00
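The list-mapping behaviour in that PR can be sketched as calling the node function once per index while broadcasting length-1 inputs (a deliberate simplification, not the PR's actual implementation):

```python
def map_node_over_list(func, inputs):
    """Call func once per index; inputs of length 1 are broadcast
    against longer lists (sketch of the list-mapping idea)."""
    max_len = max(len(v) for v in inputs.values())
    results = []
    for i in range(max_len):
        args = {k: v[i] if len(v) > 1 else v[0] for k, v in inputs.items()}
        results.append(func(**args))
    return results

out = map_node_over_list(lambda a, b: a + b, {"a": [1, 2, 3], "b": [10]})
# out == [11, 12, 13]
```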
comfyanonymous
b698ff5ce4 Add the prompt_id to some websocket messages. 2023-05-11 01:22:40 -04:00
comfyanonymous
f031ed3b4e Send websocket message with list of cached nodes right before execution. 2023-05-10 15:59:24 -04:00
comfyanonymous
5bd99f38c3 Send execution_error message on websocket on execution exception. 2023-05-10 15:49:49 -04:00
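A plausible shape for that message, assembled when a node raises during execution. The `execution_error` type name comes from the commit; the data fields and helper name are assumptions:

```python
import traceback

def execution_error_message(prompt_id, node_id, exc):
    """Build the websocket payload sent on an execution exception (sketch)."""
    return {
        "type": "execution_error",
        "data": {
            "prompt_id": prompt_id,
            "node_id": node_id,
            "exception_message": str(exc),
            "traceback": traceback.format_exception_only(type(exc), exc),
        },
    }

msg = execution_error_message("abc123", "7", ValueError("bad latent"))
```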
comfyanonymous
d8edfe1f07 Only validate each input once. 2023-05-10 00:29:31 -04:00
comfyanonymous
853b649b1d Don't print traceback when processing interrupted. 2023-05-09 23:51:52 -04:00
comfyanonymous
26380e84df If IS_CHANGED raises an exception, delete the output instead of crashing. 2023-04-26 02:13:56 -04:00
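The recovery behaviour can be sketched as evaluating IS_CHANGED defensively and treating a raise as "stale, drop the cache" (a sketch of the described behaviour, not the repo's code):

```python
def refresh_cached_output(node_id, is_changed, cache):
    """Drop the cached output when IS_CHANGED raises, instead of letting
    the exception crash prompt execution (sketch)."""
    try:
        changed = is_changed()
    except Exception:
        cache.pop(node_id, None)
        return False
    if changed:
        cache.pop(node_id, None)
    return node_id in cache

def broken_is_changed():
    raise RuntimeError("cannot stat file")

cache = {"3": "old output"}
kept = refresh_cached_output("3", broken_is_changed, cache)
```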
comfyanonymous
aa8889aa60 Don't keep cached outputs for removed nodes. 2023-04-26 02:05:57 -04:00
comfyanonymous
0c0b4c41f7 Don't delete all outputs when execution gets interrupted. 2023-04-23 22:44:38 -04:00
comfyanonymous
04da98df66 Add a way for nodes to validate their own inputs. 2023-04-23 16:03:26 -04:00
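In ComfyUI's node convention this takes the form of a `VALIDATE_INPUTS` classmethod that returns `True` on success or an error string on failure; the node below is invented for illustration, only the contract follows the convention:

```python
class ClampedInt:
    """Example node that validates its own inputs (hypothetical node)."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"value": ("INT", {"default": 0})}}

    @classmethod
    def VALIDATE_INPUTS(cls, value):
        # Return True when valid, or a human-readable error string.
        if value < 0:
            return f"value must be non-negative, got {value}"
        return True
```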
ltdrdata
73ff5c5278 Add clipspace feature. (#541)
* Add clipspace feature.
* feat: copy content to clipspace
* feat: paste content from clipspace

Extend validation to allow for validating annotated_path in addition to other parameters.

Add support for annotated_filepath in folder_paths function.

Generalize the '/upload/image' API to allow for uploading images to the 'input', 'temp', or 'output' directories.

* rename contentClipboard -> clipspace

* Do deep copy for imgs on copy to clipspace.

* add original_imgs into clipspace
* Preserve the original image when 'imgs' are modified

* Robust patch and refactoring of annotated_filepath handling in folder_paths

* Only show the Paste menu if the ComfyApp.clipspace is not empty

* Instant refresh on paste

Force triggering of 'changed' on the paste action

* Fix subfolder handling in paste logic

Attach the subfolder if it isn't empty

---------

Co-authored-by: Lt.Dr.Data <lt.dr.data@gmail.com>
2023-04-23 15:58:55 -04:00
comfyanonymous
3b9a2f504d Move the code that empties the gpu cache to model_management.py 2023-04-15 11:19:07 -04:00
藍+85CD
47e0c9415a Support releasing all unoccupied cached memory from XPU 2023-04-15 15:50:51 +08:00
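Together with the previous commit, the idea is a single cache-emptying helper that dispatches on the active accelerator. A minimal sketch (the function name and the graceful no-torch fallback are assumptions; `torch.cuda.empty_cache` and `torch.xpu.empty_cache` are the real PyTorch calls):

```python
try:
    import torch
except ImportError:  # keep the sketch runnable without torch installed
    torch = None

def soft_empty_cache():
    """Release unoccupied cached memory on whichever backend is active
    (sketch of the CUDA path extended to XPU)."""
    if torch is None:
        return "no-torch"
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
        return "cuda"
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        torch.xpu.empty_cache()
        return "xpu"
    return "cpu"

backend = soft_empty_cache()
```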
pythongosssss
8a1dcd1adf Allows nodes to return ui data and output data
Fire executed event on node when message received
2023-03-29 18:53:24 +01:00
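A node returning both kinds of data follows a dict-with-`"ui"`-and-`"result"`-keys shape; the node below is invented, and that shape is this sketch's assumption about the convention the commit introduces:

```python
class PreviewAndPass:
    """Hypothetical node that shows a preview and passes its value on."""

    RETURN_TYPES = ("INT",)
    FUNCTION = "run"

    def run(self, value=0):
        # "ui" is shown in the frontend, "result" feeds downstream nodes.
        return {"ui": {"text": [f"value is {value}"]}, "result": (value,)}

out = PreviewAndPass().run(5)
```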
Davemane42
7f2e0d1d1a add unique_id to nodes hidden inputs
@classmethod
def INPUT_TYPES(cls):
    return {
        "hidden": {"unique_id": "UNIQUE_ID"},
    }
2023-03-28 02:52:12 -04:00
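A complete node using that hidden input might look like this (the node itself is invented; in the real executor `unique_id` would be filled with the node's id in the graph, here it is passed by hand):

```python
class ReportId:
    """Hypothetical node that reports its own id in the graph."""

    RETURN_TYPES = ("STRING",)
    FUNCTION = "run"

    @classmethod
    def INPUT_TYPES(cls):
        return {"hidden": {"unique_id": "UNIQUE_ID"}}

    def run(self, unique_id):
        # hidden inputs are injected by the executor, not wired by the user
        return (f"I am node {unique_id}",)

out = ReportId().run("7")
```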
comfyanonymous
7800fecf22 Fix errors appearing more than once. 2023-03-27 02:16:58 -04:00
comfyanonymous
35a3b7de52 Fix IS_CHANGED not working on nodes with an input from another node. 2023-03-27 01:56:22 -04:00
comfyanonymous
1634d030fb Use inference_mode instead of no_grad. 2023-03-22 03:48:26 -04:00
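`torch.inference_mode` disables autograd tracking and tensor version counting outright, so it is cheaper than `no_grad` for pure inference. A wrapper sketch (the helper name and the no-torch fallback are assumptions):

```python
try:
    import torch
except ImportError:  # keep the sketch runnable without torch installed
    torch = None

def run_inference(fn, *args):
    """Run fn under torch.inference_mode when torch is available."""
    if torch is None:
        return fn(*args)
    with torch.inference_mode():
        return fn(*args)

result = run_inference(lambda x: x * 2, 21)
```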
pythongosssss
e69202b33e Updated to reuse session id if available 2023-03-07 13:24:15 +00:00
comfyanonymous
40fe09c7b8 Add a button to interrupt processing to the ui. 2023-03-02 15:24:51 -05:00
comfyanonymous
b59b82a73b Add a way to interrupt current processing in the backend. 2023-03-02 14:42:03 -05:00
comfyanonymous
392364779f Only clear cuda cache on CUDA since it causes slowdowns on ROCm. 2023-02-28 13:39:30 -05:00
comfyanonymous
fc9323caf2 Try to clear more memory at the end of each prompt execution. 2023-02-28 11:56:33 -05:00
comfyanonymous
f05d2bab79 Move some stuff from main.py to execution.py 2023-02-27 19:44:58 -05:00