Commit Graph

1097 Commits

Author SHA1 Message Date
comfyanonymous
9424522ead Reuse code. 2024-06-11 07:20:26 -04:00
Dango233
73ce178021 Remove redundancy in mmdit.py (#3685) 2024-06-11 06:30:25 -04:00
doctorpangloss
6789e9c71e Add Reference Only ControlNet hack, add ControlNet++ models 2024-06-10 20:36:23 -07:00
doctorpangloss
e7682ced56 Better support for transformers t5 2024-06-10 20:22:17 -07:00
doctorpangloss
a5d828be77 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-06-10 13:21:36 -07:00
comfyanonymous
a82fae2375 Fix bug with cosxl edit model. 2024-06-10 16:00:03 -04:00
comfyanonymous
8c4a9befa7 SD3 Support. 2024-06-10 14:06:23 -04:00
doctorpangloss
d778277a68 Merge with upstream and fix tests 2024-06-10 10:01:08 -07:00
comfyanonymous
a5e6a632f9 Support sampling non 2D latents. 2024-06-10 01:31:09 -04:00
comfyanonymous
742d5720d1 Support zeroing out text embeddings with the attention mask. 2024-06-09 16:51:58 -04:00
comfyanonymous
6cd8ffc465 Reshape the empty latent image to the right amount of channels if needed. 2024-06-08 02:35:08 -04:00
doctorpangloss
7f300bcb7a Multi-modal LLM support and ongoing improvements to language features. 2024-06-07 16:23:10 -07:00
comfyanonymous
56333d4850 Use the end token for the text encoder attention mask. 2024-06-07 03:05:23 -04:00
doctorpangloss
6575409461 Additional chat templates to ease the use of many models. 2024-06-06 20:51:05 -07:00
doctorpangloss
ebf2ef27c7 Improve LLM / language support 2024-06-06 14:57:52 -07:00
comfyanonymous
104fcea0c8 Add function to get the list of currently loaded models. 2024-06-05 23:25:16 -04:00
comfyanonymous
b1fd26fe9e pytorch xpu should be flash or mem efficient attention? 2024-06-04 17:44:14 -04:00
doctorpangloss
3f559135c6 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-06-03 11:42:55 -07:00
comfyanonymous
809cc85a8e Remove useless code. 2024-06-02 19:23:37 -04:00
comfyanonymous
b249862080 Add an annoying print to a function I want to remove. 2024-06-01 12:47:31 -04:00
doctorpangloss
5151d5b017 Include HyperSD checkpoints 2024-05-31 16:21:10 -07:00
doctorpangloss
cb557c960b Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-31 07:42:11 -07:00
doctorpangloss
3125366eda Improve compatibility with comfyui-extra-models, improve API 2024-05-30 16:50:34 -07:00
comfyanonymous
bf3e334d46 Disable non_blocking when --deterministic or directml. 2024-05-30 11:07:38 -04:00
JettHu
b26da2245f Fix UnetParams annotation typo (#3589) 2024-05-27 19:30:35 -04:00
comfyanonymous
0920e0e5fe Remove some unused imports. 2024-05-27 19:08:27 -04:00
comfyanonymous
ffc4b7c30e Fix DORA strength.
This is a different version of #3298 with more correct behavior.
2024-05-25 02:50:11 -04:00
comfyanonymous
efa5a711b2 Reduce memory usage when applying DORA: #3557 2024-05-24 23:36:48 -04:00
doctorpangloss
801ef2e3f0 Fix interrupt messaging, add AMD and Intel Dockerfiles 2024-05-23 22:51:44 -07:00
doctorpangloss
a79ccd625f bf16 selection for AMD 2024-05-22 22:45:15 -07:00
doctorpangloss
35cf996b68 ROCm 6.0 seems to require get_device_name to be called before memory methods in order to return valid data 2024-05-22 22:09:07 -07:00
comfyanonymous
6c23854f54 Fix OSX latent2rgb previews. 2024-05-22 13:56:28 -04:00
Chenlei Hu
7718ada4ed Add type annotation UnetWrapperFunction (#3531)
* Add type annotation UnetWrapperFunction

* nit

* Add types.py
2024-05-22 02:07:27 -04:00
comfyanonymous
8508df2569 Work around black image bug on Mac 14.5 by forcing attention upcasting. 2024-05-21 16:56:33 -04:00
doctorpangloss
b241ecc56d Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-21 11:38:24 -07:00
comfyanonymous
83d969e397 Disable xformers when tracing model. 2024-05-21 13:55:49 -04:00
doctorpangloss
f69b6225c0 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-20 12:06:35 -07:00
comfyanonymous
1900e5119f Fix potential issue. 2024-05-20 08:19:54 -04:00
comfyanonymous
09e069ae6c Log the pytorch version. 2024-05-20 06:22:29 -04:00
comfyanonymous
11a2ad5110 Fix controlnet not upcasting on models that have it enabled. 2024-05-19 17:58:03 -04:00
comfyanonymous
0bdc2b15c7 Cleanup. 2024-05-18 10:11:44 -04:00
comfyanonymous
98f828fad9 Remove unnecessary code. 2024-05-18 09:36:44 -04:00
doctorpangloss
519cddcefc Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-17 14:04:44 -07:00
doctorpangloss
cb45b86b63 Patch torch device code here 2024-05-17 07:19:15 -07:00
doctorpangloss
4eb66f8a0a Fix clip clone bug 2024-05-17 07:17:33 -07:00
comfyanonymous
19300655dd Don't automatically switch to lowvram mode on GPUs with low memory. 2024-05-17 00:31:32 -04:00
doctorpangloss
87d1f30902 Some base nodes now have unit tests 2024-05-16 15:01:51 -07:00
doctorpangloss
3d98440fb7 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-16 14:28:49 -07:00
doctorpangloss
5a9055fe05 Tokenizers are now shallow cloned when CLIP is cloned. This allows nodes to add vocab to the tokenizer, as some checkpoints and LoRAs may require. 2024-05-16 12:39:19 -07:00
comfyanonymous
46daf0a9a7 Add debug options to force on and off attention upcasting. 2024-05-16 04:09:41 -04:00
comfyanonymous
2d41642716 Fix lowvram dora issue. 2024-05-15 02:47:40 -04:00
doctorpangloss
8741cb3ce8 LLM support in ComfyUI
- Currently uses `transformers`
- Supports model management, correctly loading and unloading models based on what your machine can support (a minimal sketch follows this commit)
- Includes a Text Diffusers 2 workflow to demonstrate text rendering in SD1.5
2024-05-14 17:30:23 -07:00
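The model-management behavior this commit describes can be sketched roughly as follows. This is a minimal sketch, assuming a plain `transformers` checkpoint and an arbitrary free-VRAM threshold; the model name, threshold, and helper are illustrative and not this fork's actual code.

```python
# Minimal sketch (not the fork's implementation): choose a device for an LLM
# based on free VRAM, load it with transformers, then unload it afterwards.
import gc
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # placeholder checkpoint for illustration

def pick_device(min_free_gb: float = 6.0) -> str:
    """Use the GPU only if it reports enough free memory."""
    if torch.cuda.is_available():
        free_bytes, _total = torch.cuda.mem_get_info()
        if free_bytes / 1024**3 >= min_free_gb:
            return "cuda"
    return "cpu"

device = pick_device()
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

inputs = tokenizer("A short caption for a poster:", return_tensors="pt").to(device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0], skip_special_tokens=True))

# Unload: drop references and release cached VRAM so other models can be loaded.
del model
gc.collect()
if device == "cuda":
    torch.cuda.empty_cache()
```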
comfyanonymous
ec6f16adb6 Fix SAG. 2024-05-14 18:02:27 -04:00
comfyanonymous
bb4940d837 Only enable attention upcasting on models that actually need it. 2024-05-14 17:00:50 -04:00
comfyanonymous
b0ab31d06c Refactor attention upcasting code part 1. 2024-05-14 12:47:31 -04:00
doctorpangloss
78e340e2d8 Traces now include the arguments for executing a node, wherever it makes sense to do so. 2024-05-13 15:48:16 -07:00
doctorpangloss
d11aed87ba OpenAPI ImageRequestParameter node uses a Chrome user-agent to facilitate external URLs better 2024-05-13 15:03:34 -07:00
Simon Lui
f509c6fe21 Fix Intel GPU memory allocation accuracy and documentation update. (#3459)
* Change calculation of memory total to be more accurate; allocated is actually smaller than reserved.

* Update README.md install documentation for Intel GPUs.
2024-05-12 06:36:30 -04:00
comfyanonymous
fa6dd7e5bb Fix lowvram issue with saving checkpoints.
The previous fix didn't cover the case where the model was loaded in
lowvram mode right before.
2024-05-12 06:13:45 -04:00
comfyanonymous
49c20cdc70 No longer necessary. 2024-05-12 05:34:43 -04:00
comfyanonymous
e1489ad257 Fix issue with lowvram mode breaking model saving. 2024-05-11 21:55:20 -04:00
doctorpangloss
188eff3376 Omit spurious traces 2024-05-09 17:11:40 -07:00
doctorpangloss
779ff30c17 Provide a protocol for plugins to declare model-management-manageable models. Docs will be updated to specify that plugin authors should use ModelPatcher generally. 2024-05-09 16:07:18 -07:00
doctorpangloss
c2fa74f625 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-09 13:45:38 -07:00
doctorpangloss
881258acb6 Progress bar hooks, via the server, are now set via a context. This will be used in other places too. 2024-05-09 13:24:06 -07:00
comfyanonymous
93e876a3be Remove warnings that confuse people. 2024-05-09 05:29:42 -04:00
doctorpangloss
464c132c50 Add basic ImageRequestParameter node 2024-05-08 16:37:26 -07:00
doctorpangloss
0d8924442a Improve API return values and tracing reports 2024-05-08 15:52:17 -07:00
comfyanonymous
cd07340d96 Typo fix. 2024-05-08 18:36:56 -04:00
doctorpangloss
aa0cfb54ce Refine configuration of OpenTelemetry 2024-05-07 17:04:31 -07:00
doctorpangloss
3a64e04a93 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-07 13:57:53 -07:00
doctorpangloss
f8fcfa6f08 Improve tracing to propagate to backend workers correctly when using the API. Fix distributed tests. 2024-05-07 13:44:34 -07:00
comfyanonymous
c61eadf69a Make the load checkpoint with config function call the regular one.
I was going to remove this function completely because it is unmaintainable,
but I think this is the best compromise.

The clip skip and v_prediction parts of the configs should still work but
not the fp16 vs fp32.
2024-05-06 20:04:39 -04:00
doctorpangloss
75b63fce91 Remove redundant resume_download argument 2024-05-06 10:31:58 -07:00
doctorpangloss
eb7d466b95 Fix dependency on opentelemetry instrumentor; remove websocket based API example since it isn't appropriate for this fork. 2024-05-03 16:54:33 -07:00
doctorpangloss
fd81790f12 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-05-03 06:42:56 -07:00
doctorpangloss
330ecb10b2 Merge with upstream. Remove TLS flags, because a third party proxy will do this better 2024-05-02 21:57:20 -07:00
Simon Lui
a56d02efc7 Change torch.xpu to ipex.optimize, xpu device initialization and remove workaround for text node issue from older IPEX. (#3388) 2024-05-02 03:26:50 -04:00
doctorpangloss
4d060f0555 Improve support for extra models 2024-05-01 16:58:29 -07:00
comfyanonymous
f81a6fade8 Fix some edge cases with samplers and arrays with a single sigma. 2024-05-01 17:05:30 -04:00
comfyanonymous
2aed53c4ac Workaround xformers bug. 2024-04-30 21:23:40 -04:00
Garrett Sutula
bacce529fb Add TLS Support (#3312)
* Add TLS Support

* Add to readme

* Add guidance for windows users on generating certificates

* Add guidance for windows users on generating certificates

* Fix typo
2024-04-30 20:17:02 -04:00
doctorpangloss
b94b90c1cc Improve model downloader coherence with packages like controlnext-aux 2024-04-30 14:28:44 -07:00
doctorpangloss
0862863bc0 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-04-29 13:37:03 -07:00
doctorpangloss
46a712b4d6 Fix missing extension addition from upstream 2024-04-29 13:32:46 -07:00
Jedrzej Kosinski
7990ae18c1 Fix error when more cond masks passed in than batch size (#3353) 2024-04-26 12:51:12 -04:00
doctorpangloss
f965fb2bc0 Merge upstream 2024-04-24 22:41:43 -07:00
comfyanonymous
8dc19e40d1 Don't init a VAE model when there are no VAE weights. 2024-04-24 09:20:31 -04:00
comfyanonymous
c59fe9f254 Support VAE without quant_conv. 2024-04-18 21:05:33 -04:00
doctorpangloss
2643730acc Tracing 2024-04-17 08:20:07 -07:00
comfyanonymous
719fb2c81d Add basic PAG node. 2024-04-14 23:49:50 -04:00
comfyanonymous
258dbc06c3 Fix some memory related issues. 2024-04-14 12:08:58 -04:00
comfyanonymous
58812ab8ca Support SDXS 512 model. 2024-04-12 22:12:35 -04:00
comfyanonymous
831511a1ee Fix issue with sampling_settings persisting across models. 2024-04-09 23:20:43 -04:00
doctorpangloss
e49c662c7f Enable previews by default and over distributed channels 2024-04-09 13:15:05 -07:00
doctorpangloss
37cca051b6 Enable real-time progress notifications. 2024-04-08 14:56:16 -07:00
doctorpangloss
dd6f7c4215 Fix retrieving history from distributed instance 2024-04-08 14:39:16 -07:00
doctorpangloss
034ffcea03 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-04-08 10:02:37 -07:00
comfyanonymous
30abc324c2 Support properly saving CosXL checkpoints. 2024-04-08 00:36:22 -04:00
comfyanonymous
0a03009808 Fix issue with controlnet models getting loaded multiple times. 2024-04-06 18:38:39 -04:00
kk-89
38ed2da2dd Fix typo in lowvram patcher (#3209) 2024-04-05 12:02:13 -04:00
comfyanonymous
1088d1850f Support for CosXL models. 2024-04-05 10:53:41 -04:00
doctorpangloss
3e002b9f72 Fix string joining node, improve model downloading 2024-04-04 23:40:29 -07:00
comfyanonymous
41ed7e85ea Fix object_patches_backup not being the same object across clones. 2024-04-05 00:22:44 -04:00
comfyanonymous
0f5768e038 Fix missing arguments in cfg_function. 2024-04-04 23:38:57 -04:00
comfyanonymous
1f4fc9ea0c Fix issue with get_model_object on patched model. 2024-04-04 23:01:02 -04:00
comfyanonymous
1a0486bb96 Fix model needing to be loaded on GPU to generate the sigmas. 2024-04-04 22:08:49 -04:00
comfyanonymous
c6bd456c45 Make zero denoise a NOP. 2024-04-04 11:41:27 -04:00
comfyanonymous
fcfd2bdf8a Small cleanup. 2024-04-04 11:16:49 -04:00
comfyanonymous
0542088ef8 Refactor sampler code for more advanced sampler nodes part 2. 2024-04-04 01:26:41 -04:00
comfyanonymous
57753c964a Refactor sampling code for more advanced sampler nodes. 2024-04-03 22:09:51 -04:00
comfyanonymous
6c6a39251f Fix saving text encoder in fp8. 2024-04-02 11:46:34 -04:00
doctorpangloss
abb952ad77 Tweak headers to accept default when none are specified 2024-04-01 20:34:55 -07:00
comfyanonymous
e6482fbbfc Refactor calc_cond_uncond_batch into calc_cond_batch.
calc_cond_batch can take an arbitrary amount of cond inputs.

Added a calc_cond_uncond_batch wrapper with a warning so custom nodes
won't break.
2024-04-01 18:07:47 -04:00
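The backward-compatibility pattern described above can be illustrated with a small, hedged sketch; the signatures and bodies below are placeholders, not ComfyUI's actual sampling code.

```python
# Sketch of the wrapper pattern: the new function accepts any number of conds,
# while the old two-argument name is kept as a warning shim so custom nodes
# that still call it won't break.
import logging

def calc_cond_batch(model, conds, x, timestep, model_options):
    # Placeholder body: the real code batches and evaluates every cond in `conds`.
    return [model(x, timestep, cond, model_options) for cond in conds]

def calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options):
    logging.warning("calc_cond_uncond_batch is deprecated, use calc_cond_batch instead.")
    out = calc_cond_batch(model, [cond, uncond], x, timestep, model_options)
    return out[0], out[1]
```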
comfyanonymous
575acb69e4 IP2P model loading support.
This is the code to load the model and inference it with only a text
prompt. This commit does not contain the nodes to properly use it with an
image input.

This supports both the original SD1 instructpix2pix model and the
diffusers SDXL one.
2024-03-31 03:10:28 -04:00
Benjamin Berman
5208618681 Merge pull request #4 from cjonesuk/patch-1
Allow iteration over folder paths
2024-03-30 14:25:25 -07:00
doctorpangloss
b0ab12bf05 Fix #5: TAESD node was using a bad variable name that shadowed a module in a relative import 2024-03-29 16:28:13 -07:00
doctorpangloss
bd87697fdf ComfyUI Manager now starts successfully, but needs more mitigations:
- /manager/reboot needs to use a different approach to restart the currently running Python process.
- runpy should be used for install.py invocations.
2024-03-29 16:25:29 -07:00
doctorpangloss
8f548d4d19 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-03-29 13:36:57 -07:00
comfyanonymous
94a5a67c32 Cleanup to support different types of inpaint models. 2024-03-29 14:44:13 -04:00
doctorpangloss
1f705ba9d9 Fix history retrieval bug when accessing a distributed frontend 2024-03-29 10:49:51 -07:00
doctorpangloss
c6f4301e88 Fix model downloader invoking symlink when it should not 2024-03-28 15:45:04 -07:00
comfyanonymous
5d8898c056 Fix some performance issues with weight loading and unloading.
Lower peak memory usage when changing model.

Fix case where model weights would be unloaded and reloaded.
2024-03-28 18:04:42 -04:00
comfyanonymous
327ca1313d Support SDXS 0.9 2024-03-27 23:58:58 -04:00
doctorpangloss
b0be335d59 Improved support for ControlNet workflows with depth
- ComfyUI can now load EXR files.
- There are new arithmetic nodes for floats and integers.
- EXR nodes can load depth maps and be remapped with ImageApplyColormap. This allows end users to take ground-truth depth data from video game engines or 3D graphics tools and recolor it to the format expected by depth ControlNets: grayscale inverse depth maps and "inferno"-colored inverse depth maps.
- Fixed license notes.
- Added an additional known ControlNet model.
- Because CV2 is now used to read OpenEXR files, an environment variable must be set early in the application, before CV2 is imported (a sketch follows this commit). The main_pre module is therefore now imported early in more places.
2024-03-26 22:32:15 -07:00
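As a hedged illustration of the EXR and recoloring points above (the env-var requirement is documented OpenCV behavior; the file name and the node internals here are placeholders), reading an EXR depth map and recoloring it for a depth ControlNet might look like this:

```python
# Sketch only: OPENCV_IO_ENABLE_OPENEXR must be set before cv2 is imported,
# which is why the fork imports its main_pre module as early as possible.
import os
os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"

import cv2
import numpy as np

depth = cv2.imread("depth.exr", cv2.IMREAD_UNCHANGED).astype(np.float32)  # placeholder path
if depth.ndim == 3:
    depth = depth[..., 0]  # keep a single channel

# Normalize and invert: near objects become bright, far objects become dark.
d_min, d_max = float(depth.min()), float(depth.max())
inverse = 1.0 - (depth - d_min) / max(d_max - d_min, 1e-8)

gray = (inverse * 255.0).astype(np.uint8)                # grayscale inverse depth
inferno = cv2.applyColorMap(gray, cv2.COLORMAP_INFERNO)  # "inferno"-colored inverse depth

cv2.imwrite("inverse_depth_gray.png", gray)
cv2.imwrite("inverse_depth_inferno.png", inferno)
```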
comfyanonymous
ae77590b4e dora_scale support for lora file. 2024-03-25 18:09:23 -04:00
comfyanonymous
c6de09b02e Optimize memory unload strategy for better performance. 2024-03-24 02:36:30 -04:00
doctorpangloss
d8846fcb39 Improved testing of API nodes
- dynamicPrompts now set to False by default; CLIPTextEncoder and related nodes now have it set to True.
- Fixed return values of API nodes.
2024-03-22 22:04:35 -07:00
doctorpangloss
4cd8f9d2ed Merge with upstream 2024-03-22 14:35:17 -07:00
doctorpangloss
feae8c679b Add nodes to support OpenAPI and similar backend workflows 2024-03-22 14:22:50 -07:00
Christopher Jones
f8120bbd72 Allow iteration over folder paths 2024-03-22 17:01:17 +00:00
doctorpangloss
0db040cc47 Improve API support
- Removed /api/v1/images; use your own CDN-style image host and /view for maximum compatibility
- The /api/v1/prompts POST application/json response now returns the outputs dictionary (a usage sketch follows this commit)
- Caching has been removed
- More tests
- Subdirectory prefixes are now supported
- Fixed an issue where a Linux frontend and a Windows backend would produce paths that could not interoperate correctly
2024-03-21 16:24:22 -07:00
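A hedged usage sketch for the endpoint mentioned above: the exact request schema isn't spelled out in the commit message, so the workflow body and server address below are assumptions.

```python
# Sketch only: POST a workflow to /api/v1/prompts and read the returned
# outputs dictionary. The workflow contents and server address are placeholders.
import requests

workflow = {
    # ... node graph in the prompt/API format expected by the server ...
}

resp = requests.post(
    "http://127.0.0.1:8188/api/v1/prompts",  # assumed default local address/port
    json=workflow,
    timeout=600,
)
resp.raise_for_status()
outputs = resp.json()  # per the commit above, this is the outputs dictionary
print(outputs)
```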
doctorpangloss
d73b116446 Update OpenAPI spec 2024-03-21 15:16:52 -07:00
doctorpangloss
005e370254 Merge upstream 2024-03-21 13:15:36 -07:00
comfyanonymous
0624838237 Add inverse noise scaling function. 2024-03-21 14:49:11 -04:00
doctorpangloss
59cf9e5d93 Improve distributed testing 2024-03-20 20:43:21 -07:00
comfyanonymous
5d875d77fe Fix regression with lcm not working with batches. 2024-03-20 20:48:54 -04:00
comfyanonymous
4b9005e949 Fix regression with model merging. 2024-03-20 13:56:12 -04:00
comfyanonymous
c18a203a8a Don't unload model weights for non weight patches. 2024-03-20 02:27:58 -04:00
comfyanonymous
150a3e946f Make LCM sampler use the model noise scaling function. 2024-03-20 01:35:59 -04:00
doctorpangloss
3f4049c5f4 Fix Python 3.10 compatibility detail 2024-03-19 10:48:20 -07:00
comfyanonymous
40e124c6be SV3D support. 2024-03-18 16:54:13 -04:00
doctorpangloss
74a9c45395 Fix subpath model downloads 2024-03-18 10:10:34 -07:00
comfyanonymous
cacb022c4a Make saved SD1 checkpoints match more closely the official one. 2024-03-18 00:26:23 -04:00
comfyanonymous
d7897fff2c Move cascade scale factor from stage_a to latent_formats.py 2024-03-16 14:49:35 -04:00
comfyanonymous
f2fe635c9f SamplerDPMAdaptative node to test the different options. 2024-03-15 22:36:10 -04:00
comfyanonymous
448d9263a2 Fix control loras breaking. 2024-03-14 09:30:21 -04:00
doctorpangloss
a892411cf8 Add known controlnet models and add --disable-known-models to prevent them from appearing or downloading 2024-03-13 18:11:16 -07:00
comfyanonymous
db8b59ecff Lower memory usage for loras in lowvram mode at the cost of perf. 2024-03-13 20:07:27 -04:00
doctorpangloss
341c9f2e90 Improvements to node loading, node API, folder paths and progress
- Improved node loading order: it now occurs "as late as possible", and configuration should be exposed as per the README.
- Added methods for custom nodes to specify custom folders and the models used in examples more robustly (a folder_paths sketch follows this commit).
- Downloading models can now be gracefully interrupted.
- Progress notifications are now sent over the network for distributed ComfyUI operations.
- Python objects have been moved around to reduce transitive package import issues.
2024-03-13 16:14:18 -07:00
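For the custom-folder point above, a minimal sketch using upstream ComfyUI's folder_paths helpers is shown below; whether this fork's new methods differ from these helpers is not stated in the commit message, so treat the calls as an assumption.

```python
# Sketch (assumes upstream ComfyUI's folder_paths API still applies here):
# register an extra model folder from a custom node, then enumerate its files.
import os
import folder_paths

folder_paths.add_model_folder_path(
    "my_adapters",                                       # hypothetical folder name
    os.path.join(os.getcwd(), "models", "my_adapters"),  # hypothetical location
)
print(folder_paths.get_filename_list("my_adapters"))
```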