patientx
962638c9dc
Merge branch 'comfyanonymous:master' into master
2024-09-07 11:04:57 +03:00
comfyanonymous
8aabd7c8c0
SaveLora node can now save "full diff" lora format.
...
This isn't actually a lora format; it saves the full diff of the
weights in a format that the lora loader nodes can consume.
2024-09-07 03:21:02 -04:00
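The "full diff" idea above can be sketched in a few lines. This is a minimal illustration, not ComfyUI's actual implementation: the `.diff` key suffix, the function names, and the use of plain Python lists in place of tensors are all assumptions made for the sketch.

```python
# Sketch: store full weight differences under a ".diff" key suffix
# (an assumed naming convention) so a lora-style loader can apply
# them directly instead of reconstructing from a low-rank up/down pair.

def make_full_diff_state_dict(finetuned, base):
    """Build a 'full diff' dict: one entry per weight, value = finetuned - base."""
    diff = {}
    for name, w_ft in finetuned.items():
        w_base = base[name]
        # Elementwise difference; lists stand in for tensors in this sketch.
        diff[name + ".diff"] = [a - b for a, b in zip(w_ft, w_base)]
    return diff

def apply_full_diff(base, diff, strength=1.0):
    """Apply the stored diffs back onto the base weights."""
    patched = {}
    for name, w in base.items():
        d = diff.get(name + ".diff")
        patched[name] = [v + strength * dv for v, dv in zip(w, d)] if d else list(w)
    return patched
```

Unlike a real lora, this stores the whole difference per weight, so file size matches the model rather than a low-rank factorization.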
comfyanonymous
a09b29ca11
Add an option to the SaveLora node to store the bias diff.
2024-09-07 03:03:30 -04:00
comfyanonymous
9bfee68773
LoraSave node now supports generating text encoder loras.
...
text_encoder_diff should be connected to a CLIPMergeSubtract node.
model_diff and text_encoder_diff are optional inputs, so you can create
model-only loras, text-encoder-only loras, or a lora that contains both.
2024-09-07 02:30:12 -04:00
comfyanonymous
ea77750759
Support a generic Comfy format for text encoder loras.
...
This is a format with keys like:
text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight
Instead of waiting for me to add support for specific lora formats, you
can convert your text encoder loras to this format. To see an example,
save a text encoder lora with the SaveLora node using the commit right
after this one.
2024-09-07 02:20:39 -04:00
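A conversion to the generic key scheme shown above can be sketched as a string rewrite. The source naming here is an assumption (PEFT/diffusers-style `lora_A`/`lora_B` keys); only the target key layout comes from the commit message itself.

```python
# Sketch: convert PEFT/diffusers-style text encoder lora keys (an assumed
# source scheme) into the generic Comfy format from the commit message:
#   text_encoders.clip_l.transformer.text_model.encoder.layers.N.self_attn.v_proj.lora_up.weight

PEFT_TO_COMFY = {"lora_A": "lora_down", "lora_B": "lora_up"}

def to_comfy_te_key(key, encoder="clip_l"):
    """Map e.g. 'text_model.encoder.layers.9.self_attn.v_proj.lora_B.weight'
    to the generic 'text_encoders.clip_l.transformer....lora_up.weight' form."""
    for src, dst in PEFT_TO_COMFY.items():
        key = key.replace(f".{src}.", f".{dst}.")
    return f"text_encoders.{encoder}.transformer.{key}"
```

Applying this over every key of a source state dict (and saving the result) would yield a file in the generic format, assuming the source module paths already match the transformer's own naming.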
patientx
bc054d012b
Merge branch 'comfyanonymous:master' into master
2024-09-06 10:58:13 +03:00
comfyanonymous
c27ebeb1c2
Fix ONNX export not working on Flux.
2024-09-06 03:21:52 -04:00
patientx
7a83b53df1
Update README.md
2024-09-06 09:53:12 +03:00
patientx
9b8cedb3c7
Merge branch 'comfyanonymous:master' into master
2024-09-06 09:52:25 +03:00
patientx
f60dd248f2
Update README.md
2024-09-06 09:52:18 +03:00
guill
0c7c98a965
Nodes using UNIQUE_ID as input are NOT_IDEMPOTENT ( #4793 )
...
As suggested by @ltdrdata, we can automatically consider nodes that take
the UNIQUE_ID hidden input to be NOT_IDEMPOTENT.
2024-09-05 19:33:02 -04:00
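The rule described in #4793 can be sketched as a simple class inspection. The `INPUT_TYPES` / `"hidden"` / `"UNIQUE_ID"` convention follows ComfyUI's node interface, but treat the exact check here as an approximation rather than the merged implementation.

```python
# Sketch: treat any node whose INPUT_TYPES declares a UNIQUE_ID hidden
# input as NOT idempotent, since its behavior can depend on node identity.

def is_not_idempotent(node_class):
    hidden = node_class.INPUT_TYPES().get("hidden", {})
    return "UNIQUE_ID" in hidden.values()

class CounterNode:
    """Example node that receives its own unique id as a hidden input."""
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {}, "hidden": {"unique_id": "UNIQUE_ID"}}

class PlainNode:
    """Example node with no hidden inputs; safe to treat as idempotent."""
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"value": ("INT",)}}
```

With this rule, node authors do not need to flag such nodes manually; the executor can skip caching for them automatically.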
comfyanonymous
dc2eb75b85
Update stable release workflow to latest pytorch with cuda 12.4.
2024-09-05 19:21:52 -04:00
Chenlei Hu
fa34efe3bd
Update frontend to v1.2.47 ( #4798 )
...
* Update web content to release v1.2.47
* Update shortcut list
2024-09-05 18:56:01 -04:00
patientx
6fdbaf1a76
Merge branch 'comfyanonymous:master' into master
2024-09-05 12:04:05 +03:00
comfyanonymous
5cbaa9e07c
Mistoline flux controlnet support.
2024-09-05 00:05:17 -04:00
comfyanonymous
c7427375ee
Prioritize freeing partially offloaded models first.
2024-09-04 19:47:32 -04:00
patientx
894c727ce2
Update model_management.py
2024-09-05 00:05:54 +03:00
patientx
1b7b7ad6f1
Merge branch 'comfyanonymous:master' into master
2024-09-05 00:01:19 +03:00
comfyanonymous
22d1241a50
Add an experimental LoraSave node to extract model loras.
...
The model_diff input should be connected to the output of a
ModelMergeSubtract node.
2024-09-04 16:38:38 -04:00
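Extracting a lora from a model diff generally means low-rank factorizing each weight difference. A minimal numpy sketch of the standard SVD approach follows; the function name, rank, and the way singular values are split between the factors are illustrative assumptions, not the node's actual code.

```python
import numpy as np

def extract_lora(weight_diff, rank=8):
    """Factor a weight diff W (out x in) into lora_up @ lora_down of the given rank."""
    u, s, vh = np.linalg.svd(weight_diff, full_matrices=False)
    rank = min(rank, len(s))
    # Fold sqrt of the singular values into both factors for numerical balance.
    sqrt_s = np.sqrt(s[:rank])
    lora_up = u[:, :rank] * sqrt_s            # shape (out, rank)
    lora_down = sqrt_s[:, None] * vh[:rank]   # shape (rank, in)
    return lora_up, lora_down
```

Running this over every weight of a ModelMergeSubtract-style diff would produce the up/down pairs a lora file stores; the chosen rank trades file size against how faithfully `lora_up @ lora_down` reproduces the diff.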
patientx
b518390241
Merge branch 'comfyanonymous:master' into master
2024-09-04 22:36:12 +03:00
Jedrzej Kosinski
f04229b84d
Add emb_patch support to UNetModel forward ( #4779 )
2024-09-04 14:35:15 -04:00
patientx
64f428801e
Merge branch 'comfyanonymous:master' into master
2024-09-04 09:29:56 +03:00
Silver
f067ad15d1
Make live preview size a configurable launch argument ( #4649 )
...
* Make live preview size a configurable launch argument
* Remove import from testing phase
* Update cli_args.py
2024-09-03 19:16:38 -04:00
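Making a preview size configurable at launch is a standard argparse pattern. A minimal sketch follows; the flag name `--preview-size` and the default of 512 are assumptions for illustration, not necessarily what #4649 merged.

```python
import argparse

# Sketch: expose the live preview size as a launch argument.
# Flag name and default are assumed for this example.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--preview-size",
    type=int,
    default=512,
    help="Max side length, in pixels, of live preview images.",
)

defaults = parser.parse_args([])                       # no flags: use default
custom = parser.parse_args(["--preview-size", "768"])  # user override
```

The parsed value would then be passed to whatever produces the preview images, replacing a previously hard-coded constant.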
comfyanonymous
483004dd1d
Support newer glora format.
2024-09-03 17:02:19 -04:00
patientx
0ad3f385b4
Update start.bat
2024-09-03 15:23:59 +03:00
patientx
88ccc8f3a5
Merge branch 'comfyanonymous:master' into master
2024-09-03 11:01:28 +03:00
comfyanonymous
00a5d08103
Lower fp8 lora memory usage.
2024-09-03 01:25:05 -04:00
patientx
f2122a355b
Merge branch 'comfyanonymous:master' into master
2024-09-02 16:06:23 +03:00
comfyanonymous
d043997d30
Flux onetrainer lora.
2024-09-02 08:22:15 -04:00
patientx
93fa5c9ebb
Merge branch 'comfyanonymous:master' into master
2024-09-02 10:03:48 +03:00
Alex "mcmonkey" Goodwin
f1c2301697
fix typo in stale-issues ( #4735 )
2024-09-01 17:44:49 -04:00
comfyanonymous
8d31a6632f
Speed up inference on nvidia 10 series on Linux.
2024-09-01 17:29:31 -04:00
patientx
f02c0d3ed9
Merge branch 'comfyanonymous:master' into master
2024-09-01 14:34:56 +03:00
comfyanonymous
b643eae08b
Make minimum_inference_memory() depend on --reserve-vram
2024-09-01 01:18:34 -04:00
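The change above ties the inference memory floor to the user's `--reserve-vram` setting. A rough sketch of that relationship follows; the 1 GiB baseline and the function shape are assumptions for illustration, not ComfyUI's actual numbers.

```python
# Sketch: make the minimum inference memory floor grow with the amount
# of VRAM the user asked to keep free via --reserve-vram.
# Baseline and formula are assumed values for this example.

GIB = 1024 ** 3

def minimum_inference_memory(reserve_vram_gib=0.0):
    baseline = 1 * GIB
    # A larger reserve raises the floor, so model offloading kicks in
    # earlier and the reserved headroom actually stays free.
    return baseline + int(reserve_vram_gib * GIB)
```

The effect is that users who reserve VRAM for other applications no longer have the model manager pack memory right up to the old fixed floor.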
patientx
57f0e71a0f
Update README.md
2024-08-31 14:04:19 +03:00
patientx
03085fbe61
Merge branch 'comfyanonymous:master' into master
2024-08-31 14:03:34 +03:00
patientx
3d8b5c3739
Update README.md
2024-08-31 14:03:28 +03:00
comfyanonymous
baa6b4dc36
Update manual install instructions.
2024-08-31 04:37:23 -04:00
Alex "mcmonkey" Goodwin
d4aeefc297
add github action to automatically handle stale user support issues ( #4683 )
...
* add github action to automatically handle stale user support issues
* improve stale message
* remove token part
2024-08-31 01:57:18 -04:00
comfyanonymous
587e7ca654
Remove github buttons.
2024-08-31 01:53:10 -04:00
Chenlei Hu
c90459eba0
Update ComfyUI_frontend to 1.2.40 ( #4691 )
...
* Update ComfyUI_frontend to 1.2.40
* Add files
2024-08-30 19:32:10 -04:00
Vedat Baday
04278afb10
feat: return import_failed from init_extra_nodes function ( #4694 )
2024-08-30 19:26:47 -04:00
patientx
acc3d6a2ea
Update model_management.py
2024-08-30 20:13:28 +03:00
patientx
fa38827a53
Merge branch 'comfyanonymous:master' into master
2024-08-30 20:10:54 +03:00
patientx
51af2440ef
Update model_management.py
2024-08-30 20:10:47 +03:00
patientx
c371e1da3a
Update server.py
2024-08-30 20:09:31 +03:00
patientx
3e226f02f3
Update model_management.py
2024-08-30 20:08:18 +03:00
comfyanonymous
935ae153e1
Cleanup.
2024-08-30 12:53:59 -04:00
patientx
aeab6d1370
Merge branch 'comfyanonymous:master' into master
2024-08-30 19:49:03 +03:00
Chenlei Hu
e91662e784
Get logs endpoint & system_stats additions ( #4690 )
...
* Add route for getting output logs
* Include ComfyUI version
* Move to own function
* Changed to memory logger
* Unify logger setup logic
* Fix get version git fallback
---------
Co-authored-by: pythongosssss <125205205+pythongosssss@users.noreply.github.com>
2024-08-30 12:46:37 -04:00