doctorpangloss
bbe2ed330c
Memory management and compilation improvements
...
- Experimental support for sage attention on Linux
- Diffusers loader now supports model indices
- Transformers model management now aligns with updates to ComfyUI
- Flux layers correctly use unbind
- Add float8 support for model loading in more places
- Experimental quantization approaches from Quanto and torchao
- Model upscaling now interacts better with memory management
This update also disables ROCm testing because it isn't reliable enough
on consumer hardware; the RX 7600 is not really supported by ROCm.
2024-10-09 09:13:47 -07:00
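A rough library-level sketch of what the float8 and quantization bullets above refer to: weight-only quantization with torchao or optimum-quanto, and float8 weight storage in plain PyTorch. The toy module and exact call pattern are assumptions for illustration, not this repository's loader code.

    import torch
    import torch.nn as nn
    from torchao.quantization import quantize_, int8_weight_only
    from optimum.quanto import quantize, freeze, qint8

    def toy_model() -> nn.Module:
        # Stand-in for a diffusion or text-encoder module; purely illustrative.
        return nn.Sequential(nn.Linear(1024, 1024), nn.GELU(), nn.Linear(1024, 1024))

    # torchao: in-place weight-only int8 quantization of a module.
    m1 = toy_model()
    quantize_(m1, int8_weight_only())

    # optimum-quanto: quantize, then freeze to materialize the quantized weights.
    m2 = toy_model()
    quantize(m2, weights=qint8)
    freeze(m2)

    # float8 storage: keep a weight in an 8-bit float dtype, upcast before doing math.
    w_fp8 = toy_model()[0].weight.data.to(torch.float8_e4m3fn)
    y = w_fp8.to(torch.float32) @ torch.randn(1024, 1024)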
doctorpangloss
fa3176f96f
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-09-23 12:50:31 -07:00
comfyanonymous
0849c80e2a
get_key_patches now works without unloading the model.
2024-09-17 01:57:59 -04:00
doctorpangloss
ffb4ed9cf2
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-09-13 12:45:23 -07:00
comfyanonymous
9d720187f1
types -> comfy_types to fix import issue.
2024-09-12 03:57:46 -04:00
doctorpangloss
8615c86722
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-08-26 16:59:38 -07:00
comfyanonymous
2ca8f6e23d
Make the stochastic fp8 rounding reproducible.
2024-08-26 15:12:06 -04:00
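For context on what "stochastic fp8 rounding" refers to: instead of always rounding to the nearest representable value, each element is rounded up or down at random with probability proportional to its distance to the two neighbours, which keeps the rounding unbiased in expectation; making it reproducible then comes down to seeding the source of that randomness. The sketch below illustrates the idea on a uniform grid (fp8's spacing actually varies with the exponent) and is not comfy's implementation.

    import torch

    def stochastic_round(x: torch.Tensor, step: float, generator: torch.Generator) -> torch.Tensor:
        # Round x to multiples of `step`, choosing the lower or upper neighbour at
        # random with probability proportional to the distance to each one.
        scaled = x / step
        lower = torch.floor(scaled)
        frac = scaled - lower                      # distance to the lower neighbour, in [0, 1)
        go_up = torch.rand(x.shape, generator=generator) < frac
        return (lower + go_up.to(x.dtype)) * step

    # Reproducibility: a fixed seed makes the random up/down choices deterministic.
    g = torch.Generator().manual_seed(1234)
    w = torch.randn(4, 4)
    print(stochastic_round(w, step=0.125, generator=g))  # 0.125 is the e4m3 spacing near 1.0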
comfyanonymous
c6812947e9
Fix potential memory leak.
2024-08-26 02:07:32 -04:00
doctorpangloss
48ca1a4910
Include Kijai fp8 nodes. LoRAs are not supported by nf4.
2024-08-25 22:41:10 -07:00
doctorpangloss
5155a3e248
Merge WIP
2024-08-25 18:52:29 -07:00
comfyanonymous
7df42b9a23
Fix dora.
2024-08-23 04:58:59 -04:00
comfyanonymous
c26ca27207
Move calculate function to comfy.lora
2024-08-22 17:12:00 -04:00
comfyanonymous
7c6bb84016
Code cleanups.
2024-08-22 17:05:12 -04:00
comfyanonymous
4d341b78e8
Bug fixes.
2024-08-19 16:28:55 -04:00
comfyanonymous
6138f92084
Use better dtype for the lowvram lora system.
2024-08-19 15:35:25 -04:00
comfyanonymous
be0726c1ed
Remove duplication.
2024-08-19 15:26:50 -04:00
comfyanonymous
20ace7c853
Code cleanup.
2024-08-19 12:48:59 -04:00
comfyanonymous
bb222ceddb
Fix loras having a weak effect when applied on fp8.
2024-08-17 15:20:17 -04:00
comfyanonymous
cd5017c1c9
Allow the calculate_weight function to use a different dtype.
2024-08-17 01:06:08 -04:00
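The fp8 LoRA commits above come down to a numerical point: a LoRA delta is often far smaller than the spacing of the fp8 grid, so quantizing the delta (or the intermediate arithmetic) to fp8 before merging rounds most of the update away, which is what a "weak effect" looks like. Computing the patch in a wider dtype and converting only once at the end preserves it. A minimal, generic illustration, not the repository's calculate_weight code:

    import torch

    w_fp8 = torch.randn(1024, 1024).to(torch.float8_e4m3fn)  # weight stored in fp8
    delta = 2e-4 * torch.randn(1024, 1024)                    # small LoRA update, e.g. scale * (B @ A)

    base = w_fp8.to(torch.float16)

    # Patching in fp8: the delta is quantized to the fp8 grid first and mostly
    # rounds to zero, so the LoRA barely changes the weight.
    weak = base + delta.to(torch.float8_e4m3fn).to(torch.float16)

    # Patching in a wider dtype: merge in fp16/fp32 and convert once at the end
    # (or keep the patched copy in the compute dtype), so the update survives.
    strong = base + delta.to(torch.float16)

    print((weak - base).abs().mean().item())    # ~0: update largely lost
    print((strong - base).abs().mean().item())  # ~2e-4: update preserved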
comfyanonymous
83f343146a
Fix potential lowvram issue.
2024-08-16 17:12:42 -04:00
doctorpangloss
b0e25488dd
Fix tokenizer cloning
2024-08-13 20:51:07 -07:00
doctorpangloss
0549f35e85
Merge commit '39fb74c5bd13a1dccf4d7293a2f7a755d9f43cbd' of github.com:comfyanonymous/ComfyUI
...
- Improvements to tests
- Fixes for model management
- Fixes for issues with language nodes
2024-08-13 20:08:56 -07:00
comfyanonymous
74e124f4d7
Fix some issues with the text encoder (TE) being in lowvram mode.
2024-08-12 23:42:21 -04:00
comfyanonymous
5d43e75e5b
Fix some issues with the model sometimes not getting patched.
2024-08-12 12:27:54 -04:00
comfyanonymous
517f4a94e4
Fix some lora loading slowdowns.
2024-08-12 11:50:32 -04:00
comfyanonymous
52a471c5c7
Change name of log.
2024-08-12 10:35:06 -04:00
comfyanonymous
1de69fe4d5
Fix some issues with inference slowing down.
2024-08-10 16:21:25 -04:00
comfyanonymous
6678d5cf65
Fix regression.
2024-08-09 14:02:38 -04:00
comfyanonymous
a3cc326748
Better fix for lowvram issue.
2024-08-09 12:16:25 -04:00
comfyanonymous
5acdadc9f3
Fix issue with some lowvram weights.
2024-08-09 03:58:28 -04:00
comfyanonymous
1e11d2d1f5
Better prints.
2024-08-08 17:29:27 -04:00
comfyanonymous
08f92d55e9
Partial model shift support.
2024-08-08 14:45:06 -04:00
comfyanonymous
cb7c4b4be3
Workaround for lora OOM on lowvram mode.
2024-08-07 14:30:54 -04:00
comfyanonymous
b334605a66
Fix OOMs happening in some cases.
...
A cloned model patcher sometimes reported a model was loaded on a device
when it wasn't.
2024-08-06 13:36:04 -04:00
doctorpangloss
72baecad87
Improve logging and tracing for validation errors
2024-07-16 12:26:30 -07:00
doctorpangloss
3d1d833e6f
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-07-15 14:22:49 -07:00
Extraltodeus
f1a01c2c7e
Add sampler_pre_cfg_function (#3979)
...
* Update samplers.py
* Update model_patcher.py
2024-07-09 16:20:49 -04:00
doctorpangloss
a13088ccec
Merge upstream
2024-07-04 11:58:55 -07:00
comfyanonymous
73ca780019
Add SamplerEulerCFG++ node.
...
This node should match the DDIM implementation of CFG++ when "regular" is
selected.
"alternative" is a slightly different take on CFG++
2024-06-23 13:21:18 -04:00
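For reference, the difference between regular CFG and CFG++ in a DDIM-style update is narrow: both form the denoised estimate from a guided noise prediction, but CFG++ (as described in the CFG++ paper) re-noises with the unconditional prediction instead of the guided one. The sketch below paraphrases that published formulation with assumed tensor names; it is not the node's actual code.

    import torch

    def ddim_step_cfg(x_t, eps_cond, eps_uncond, a_t, a_prev, w):
        # Regular CFG: one guided prediction is used both for the x0 estimate
        # and for the re-noising term.
        eps_cfg = eps_uncond + w * (eps_cond - eps_uncond)
        x0 = (x_t - (1 - a_t).sqrt() * eps_cfg) / a_t.sqrt()
        return a_prev.sqrt() * x0 + (1 - a_prev).sqrt() * eps_cfg

    def ddim_step_cfgpp(x_t, eps_cond, eps_uncond, a_t, a_prev, lam):
        # CFG++: guided prediction (with lam typically in [0, 1]) for the x0
        # estimate, but the unconditional prediction for the re-noising term.
        eps_guided = eps_uncond + lam * (eps_cond - eps_uncond)
        x0 = (x_t - (1 - a_t).sqrt() * eps_guided) / a_t.sqrt()
        return a_prev.sqrt() * x0 + (1 - a_prev).sqrt() * eps_uncond

    # Toy tensors, just to show the call pattern (a_t, a_prev are cumulative alphas).
    x_t = torch.randn(1, 4, 64, 64)
    eps_c, eps_u = torch.randn_like(x_t), torch.randn_like(x_t)
    a_t, a_prev = torch.tensor(0.5), torch.tensor(0.6)
    _ = ddim_step_cfg(x_t, eps_c, eps_u, a_t, a_prev, w=7.5)
    _ = ddim_step_cfgpp(x_t, eps_c, eps_u, a_t, a_prev, lam=0.6)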
comfyanonymous
028a583bef
Fix issue with full diffusers SD3 loras.
2024-06-19 22:32:04 -04:00
doctorpangloss
8cdc246450
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-06-17 16:19:48 -07:00
comfyanonymous
0e06b370db
Print key names for easier debugging.
2024-06-14 18:18:53 -04:00
Max Tretikov
2f12a8a790
Ensure patch_data is defined
2024-06-13 15:32:09 -06:00
comfyanonymous
37a08a41b3
Support setting weight offsets in weight patcher.
2024-06-13 17:21:26 -04:00
doctorpangloss
3f559135c6
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-06-03 11:42:55 -07:00
comfyanonymous
809cc85a8e
Remove useless code.
2024-06-02 19:23:37 -04:00
doctorpangloss
cb557c960b
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-05-31 07:42:11 -07:00
comfyanonymous
ffc4b7c30e
Fix DORA strength.
...
This is a different version of #3298 with more correct behavior.
2024-05-25 02:50:11 -04:00
comfyanonymous
efa5a711b2
Reduce memory usage when applying DORA: #3557
2024-05-24 23:36:48 -04:00
Chenlei Hu
7718ada4ed
Add type annotation UnetWrapperFunction (#3531)
...
* Add type annotation UnetWrapperFunction
* nit
* Add types.py
2024-05-22 02:07:27 -04:00