Commit Graph

106 Commits

Author SHA1 Message Date
doctorpangloss
a8d8bff548 Improve support for torch compilation and sage attention 2024-10-29 19:22:26 -07:00
doctorpangloss
76a80a65ea Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-10-29 15:35:39 -07:00
comfyanonymous
f9f9faface Fixed model merging issue with scaled fp8. 2024-10-20 06:24:31 -04:00
comfyanonymous
a68bbafddb Support diffusion models with scaled fp8 weights. 2024-10-19 23:47:42 -04:00
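
A minimal sketch of the scaled-fp8 idea in the commit above: each fp8 weight tensor is stored together with a higher-precision scale and is rescaled before use. Names and shapes below are illustrative, not ComfyUI's actual internals.

```python
import torch

def dequantize_scaled_fp8(weight_fp8: torch.Tensor, scale: torch.Tensor,
                          compute_dtype: torch.dtype = torch.bfloat16) -> torch.Tensor:
    # fp8 tensors can't be used directly by most ops; upcast, then apply the
    # per-tensor scale recorded when the weight was quantized.
    return weight_fp8.to(compute_dtype) * scale.to(compute_dtype)

# Example: produce a scaled fp8 weight and restore a usable copy of it.
w = torch.randn(128, 128)
scale = w.abs().max() / torch.finfo(torch.float8_e4m3fn).max
w_fp8 = (w / scale).to(torch.float8_e4m3fn)
w_usable = dequantize_scaled_fp8(w_fp8, scale)
```
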
doctorpangloss
8512f361fe Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-10-14 15:26:27 -07:00
doctorpangloss
a38968f098 Improvements to execution
- Validation errors that occur early in the lifecycle of prompt
  execution now get propagated to their callers in the
  EmbeddedComfyClient. This includes error messages about missing node
  classes.
- The execution context now includes the node_id and the prompt_id.
- Latent previews are now sent with a node_id. This is not backwards
  compatible with old frontends.
- Dependency execution errors are now modeled correctly.
- Distributed progress encodes image previews with node and prompt IDs.
- Typing for models.
- The frontend was updated to use node IDs with previews.
- Improvements to torch.compile experiments.
- Some controlnet_aux nodes were upstreamed.
2024-10-10 19:30:18 -07:00
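
The commit above threads node and prompt identity through execution so previews and errors can be attributed. A minimal sketch of that pattern (hypothetical names; not the actual EmbeddedComfyClient API):

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass(frozen=True)
class ExecutionContext:
    # Identity of the work item currently executing.
    prompt_id: str
    node_id: Optional[str] = None

def send_latent_preview(ctx: ExecutionContext, preview_png: bytes,
                        send: Callable[[dict], None]) -> None:
    # Tag every preview with the prompt and node that produced it so the
    # frontend (or a remote worker) can route it to the right widget.
    send({
        "type": "latent_preview",
        "prompt_id": ctx.prompt_id,
        "node_id": ctx.node_id,
        "image": preview_png,
    })
```
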
doctorpangloss
69e523b89d Experimental quantization support. Only Linux is meaningfully supported 2024-10-10 13:43:06 -07:00
doctorpangloss
bbe2ed330c Memory management and compilation improvements
- Experimental support for sage attention on Linux
- Diffusers loader now supports model indices
- Transformers model management now aligns with updates to ComfyUI
- Flux layers correctly use unbind
- Add float8 support for model loading in more places
- Experimental quantization approaches from Quanto and torchao
- Model upscaling now interacts better with memory management

This update also disables ROCm testing because it isn't reliable enough
on consumer hardware. ROCm is not really supported by the 7600.
2024-10-09 09:13:47 -07:00
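
As a rough illustration of the torchao-style weight-only quantization mentioned above (a sketch assuming a recent torchao and a CUDA device; it is not the exact integration used in this repository):

```python
import torch
from torchao.quantization import quantize_, int8_weight_only

class TinyBlock(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = torch.nn.Linear(512, 512)

    def forward(self, x):
        return self.proj(x)

model = TinyBlock().to(torch.bfloat16).cuda()
# Swap eligible Linear weights for int8 weight-only quantized versions,
# trading a little precision for a smaller VRAM footprint.
quantize_(model, int8_weight_only())
out = model(torch.randn(1, 512, dtype=torch.bfloat16, device="cuda"))
```
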
comfyanonymous
3bb4dec720 Fix issue with loras, lowvram and --fast fp8. 2024-09-28 14:42:32 -04:00
doctorpangloss
fa3176f96f Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-09-23 12:50:31 -07:00
comfyanonymous
0849c80e2a get_key_patches now works without unloading the model. 2024-09-17 01:57:59 -04:00
doctorpangloss
ffb4ed9cf2 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-09-13 12:45:23 -07:00
comfyanonymous
9d720187f1 types -> comfy_types to fix import issue. 2024-09-12 03:57:46 -04:00
doctorpangloss
8615c86722 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-08-26 16:59:38 -07:00
comfyanonymous
2ca8f6e23d Make the stochastic fp8 rounding reproducible. 2024-08-26 15:12:06 -04:00
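
Stochastic rounding picks between the two neighbouring representable values with probability proportional to the distance; making it reproducible means seeding the random draw. A generic sketch of the idea for fp8 (it ignores subnormals and range clamping, and is not ComfyUI's exact implementation):

```python
import torch

def stochastic_round_to_float8(x: torch.Tensor, seed: int,
                               dtype: torch.dtype = torch.float8_e4m3fn,
                               mantissa_bits: int = 3) -> torch.Tensor:
    # Split into mantissa in [0.5, 1) and exponent, round the mantissa to the
    # target grid stochastically, then recombine and cast. A seeded generator
    # makes the result identical run to run.
    generator = torch.Generator(device=x.device).manual_seed(seed)
    mantissa, exponent = torch.frexp(x.float())
    steps = 2.0 ** (mantissa_bits + 1)      # mantissa grid points inside [0.5, 1)
    scaled = mantissa * steps
    floor = torch.floor(scaled)
    frac = scaled - floor                   # distance to the lower grid point
    round_up = torch.rand(x.shape, generator=generator, device=x.device) < frac
    rounded = (floor + round_up) / steps
    return torch.ldexp(rounded, exponent).to(dtype)
```
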
comfyanonymous
c6812947e9 Fix potential memory leak. 2024-08-26 02:07:32 -04:00
doctorpangloss
48ca1a4910 Include Kijai fp8 nodes. LoRAs are not supported by nf4 2024-08-25 22:41:10 -07:00
doctorpangloss
5155a3e248 Merge WIP 2024-08-25 18:52:29 -07:00
comfyanonymous
7df42b9a23 Fix dora. 2024-08-23 04:58:59 -04:00
comfyanonymous
c26ca27207 Move calculate function to comfy.lora 2024-08-22 17:12:00 -04:00
comfyanonymous
7c6bb84016 Code cleanups. 2024-08-22 17:05:12 -04:00
comfyanonymous
4d341b78e8 Bug fixes. 2024-08-19 16:28:55 -04:00
comfyanonymous
6138f92084 Use better dtype for the lowvram lora system. 2024-08-19 15:35:25 -04:00
comfyanonymous
be0726c1ed Remove duplication. 2024-08-19 15:26:50 -04:00
comfyanonymous
20ace7c853 Code cleanup. 2024-08-19 12:48:59 -04:00
comfyanonymous
bb222ceddb Fix loras having a weak effect when applied on fp8. 2024-08-17 15:20:17 -04:00
comfyanonymous
cd5017c1c9 calculate_weight function to use a different dtype. 2024-08-17 01:06:08 -04:00
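
The two fp8/LoRA commits above address the same pitfall: adding a small low-rank delta directly to an fp8 weight rounds most of it away. A sketch of the usual remedy, computing the patch in a wider intermediate dtype before casting back (illustrative; not the actual calculate_weight signature):

```python
import torch

def apply_lora_patch(weight: torch.Tensor, lora_up: torch.Tensor,
                     lora_down: torch.Tensor, strength: float,
                     intermediate_dtype: torch.dtype = torch.float32) -> torch.Tensor:
    # Upcast the stored weight (possibly fp8) and build the low-rank delta in
    # a wide dtype so the small update isn't lost to rounding, then cast the
    # patched result back to the storage dtype once, at the end.
    w = weight.to(intermediate_dtype)
    delta = strength * (lora_up.to(intermediate_dtype) @ lora_down.to(intermediate_dtype))
    return (w + delta).to(weight.dtype)
```
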
comfyanonymous
83f343146a Fix potential lowvram issue. 2024-08-16 17:12:42 -04:00
doctorpangloss
b0e25488dd Fix tokenizer cloning 2024-08-13 20:51:07 -07:00
doctorpangloss
0549f35e85 Merge commit '39fb74c5bd13a1dccf4d7293a2f7a755d9f43cbd' of github.com:comfyanonymous/ComfyUI
- Improvements to tests
- Fixes to model management
- Fixes for issues with language nodes
2024-08-13 20:08:56 -07:00
comfyanonymous
74e124f4d7 Fix some issues with TE being in lowvram mode. 2024-08-12 23:42:21 -04:00
comfyanonymous
5d43e75e5b Fix some issues with the model sometimes not getting patched. 2024-08-12 12:27:54 -04:00
comfyanonymous
517f4a94e4 Fix some lora loading slowdowns. 2024-08-12 11:50:32 -04:00
comfyanonymous
52a471c5c7 Change name of log. 2024-08-12 10:35:06 -04:00
comfyanonymous
1de69fe4d5 Fix some issues with inference slowing down. 2024-08-10 16:21:25 -04:00
comfyanonymous
6678d5cf65 Fix regression. 2024-08-09 14:02:38 -04:00
comfyanonymous
a3cc326748 Better fix for lowvram issue. 2024-08-09 12:16:25 -04:00
comfyanonymous
5acdadc9f3 Fix issue with some lowvram weights. 2024-08-09 03:58:28 -04:00
comfyanonymous
1e11d2d1f5 Better prints. 2024-08-08 17:29:27 -04:00
comfyanonymous
08f92d55e9 Partial model shift support. 2024-08-08 14:45:06 -04:00
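
"Partial model shift" here means loading only as many weights onto the GPU as the current VRAM budget allows and leaving the rest on the CPU. A rough sketch of that policy (hypothetical helper; not the actual model-management code):

```python
import torch

def partially_load(model: torch.nn.Module, device: torch.device,
                   vram_budget_bytes: int) -> int:
    # Move leaf modules to the device one by one until the budget is spent;
    # whatever doesn't fit stays on the CPU and can be shifted in later.
    used = 0
    for module in model.modules():
        if any(True for _ in module.children()):
            continue  # only consider leaf modules
        size = sum(p.numel() * p.element_size()
                   for p in module.parameters(recurse=False))
        if size == 0 or used + size > vram_budget_bytes:
            continue
        module.to(device)
        used += size
    return used
```
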
comfyanonymous
cb7c4b4be3 Workaround for lora OOM on lowvram mode. 2024-08-07 14:30:54 -04:00
comfyanonymous
b334605a66 Fix OOMs happening in some cases.
A cloned model patcher sometimes reported a model was loaded on a device
when it wasn't.
2024-08-06 13:36:04 -04:00
doctorpangloss
72baecad87 Improve logging and tracing for validation errors 2024-07-16 12:26:30 -07:00
doctorpangloss
3d1d833e6f Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-07-15 14:22:49 -07:00
Extraltodeus
f1a01c2c7e Add sampler_pre_cfg_function (#3979)
* Update samplers.py
* Update model_patcher.py
2024-07-09 16:20:49 -04:00
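
A sketch of using the hook added in #3979, assuming it is registered through the ModelPatcher as set_model_sampler_pre_cfg_function and receives an args dict containing "conds_out"; check samplers.py in your checkout for the exact keys.

```python
def damp_uncond_pre_cfg(args):
    # conds_out is assumed to hold the per-cond model outputs before CFG
    # mixing, with the unconditional prediction at index 1.
    conds_out = args["conds_out"]
    if len(conds_out) > 1 and conds_out[1] is not None:
        conds_out[1] = conds_out[1] * 0.9  # example tweak: soften the uncond
    return conds_out

# `model` is a ModelPatcher, e.g. the MODEL output of a checkpoint loader.
patched = model.clone()
patched.set_model_sampler_pre_cfg_function(damp_uncond_pre_cfg)
```
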
doctorpangloss
a13088ccec Merge upstream 2024-07-04 11:58:55 -07:00
comfyanonymous
73ca780019 Add SamplerEulerCFG++ node.
This node should match the DDIM implementation of CFG++ when "regular" is
selected.

"alternative" is a slightly different take on CFG++
2024-06-23 13:21:18 -04:00
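
CFG++ changes how a step is re-noised: the guided output still provides the denoised estimate, but the direction back to the next sigma comes from the unconditional prediction. A schematic Euler-style step under that reading (a sketch, not the node's exact code):

```python
def euler_cfgpp_step(x, sigma, sigma_next, cond_denoised, uncond_denoised, cfg_scale):
    # Ordinary classifier-free guidance for the denoised estimate.
    denoised = uncond_denoised + cfg_scale * (cond_denoised - uncond_denoised)
    # CFG++ ("regular"): derive the noise direction from the unconditional
    # prediction rather than the guided one, then step to sigma_next.
    d = (x - uncond_denoised) / sigma
    return denoised + d * sigma_next
```
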
comfyanonymous
028a583bef Fix issue with full diffusers SD3 loras. 2024-06-19 22:32:04 -04:00
doctorpangloss
8cdc246450 Merge branch 'master' of github.com:comfyanonymous/ComfyUI 2024-06-17 16:19:48 -07:00
comfyanonymous
0e06b370db Print key names for easier debugging. 2024-06-14 18:18:53 -04:00