Commit Graph

1150 Commits

Author · SHA1 · Message · Date
patientx
8bcf408c79
Merge branch 'comfyanonymous:master' into master 2024-09-15 16:41:23 +03:00
comfyanonymous
e813abbb2c Long CLIP L support for SDXL, SD3 and Flux.
Use the *CLIPLoader nodes.
2024-09-15 07:59:38 -04:00
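
For readers who want to try this: the long CLIP-L checkpoint is loaded through the same *CLIPLoader nodes as any other text encoder. A rough API-format prompt fragment is sketched below; the DualCLIPLoader class, the "flux" type value, and the file names are illustrative assumptions, not taken from this commit.

```python
# Hypothetical API-format prompt fragment; node and file names are assumptions,
# not part of the commit above. CLIPLoader / TripleCLIPLoader work the same way.
prompt_fragment = {
    "1": {
        "class_type": "DualCLIPLoader",
        "inputs": {
            "clip_name1": "long_clip_l.safetensors",  # placeholder Long CLIP-L checkpoint
            "clip_name2": "t5xxl_fp16.safetensors",   # placeholder second text encoder
            "type": "flux",
        },
    },
}
```
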
patientx
b9a24c0146
Merge branch 'comfyanonymous:master' into master 2024-09-14 16:14:57 +03:00
comfyanonymous
f48e390032 Support AliMama SD3 and Flux inpaint controlnets.
Use the ControlNetInpaintingAliMamaApply node.
2024-09-14 09:05:16 -04:00
patientx
2710ea1aa2
Merge branch 'comfyanonymous:master' into master 2024-09-13 23:07:04 +03:00
comfyanonymous
cf80d28689 Support loading controlnets with different input. 2024-09-13 09:54:37 -04:00
patientx
8ca077b268
Merge branch 'comfyanonymous:master' into master 2024-09-12 16:48:54 +03:00
Robin Huang
b962db9952
Add cli arg to override user directory (#4856)
* Override user directory.

* Use overridden user directory.

* Remove prints.

* Remove references to global user_files.

* Remove unused replace_folder function.

* Remove newline.

* Remove global during get_user_directory.

* Add validation.
2024-09-12 08:10:27 -04:00
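
The PR body above only lists refactoring steps, so here is a minimal sketch of how a user-directory override like this typically works. The flag name --user-directory and the get_user_directory() helper are assumptions for illustration, not necessarily the names used in #4856.

```python
# Minimal sketch of a CLI override for the user directory (names are assumptions).
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("--user-directory", type=str, default=None,
                    help="Override the default location of the user/ directory.")
args, _ = parser.parse_known_args()

def get_user_directory(default_root: str = os.path.dirname(os.path.abspath(__file__))) -> str:
    # Prefer the explicit CLI override, otherwise fall back to <repo>/user.
    if args.user_directory is not None:
        return os.path.abspath(args.user_directory)
    return os.path.join(default_root, "user")
```
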
patientx
73c31987c6
Merge branch 'comfyanonymous:master' into master 2024-09-12 12:44:27 +03:00
comfyanonymous
9d720187f1 types -> comfy_types to fix import issue. 2024-09-12 03:57:46 -04:00
patientx
8eb7ca051a
Merge branch 'comfyanonymous:master' into master 2024-09-11 10:26:46 +03:00
comfyanonymous
9f4daca9d9 Doesn't really make sense for cfg_pp sampler to call regular one. 2024-09-11 02:51:36 -04:00
yoinked
b5d0f2a908
Add CFG++ to DPM++ 2S Ancestral (#3871)
* Update sampling.py

* Update samplers.py

* my bad

* "fix" the sampler

* Update samplers.py

* i named it wrong

* minor sampling improvements

mainly using a dynamic rho value (hey this sounds a lot like smea!!!)

* revert rho change

rho? r? it's just 1/2
2024-09-11 02:49:44 -04:00
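
For readers unfamiliar with the CFG++ idea referenced in this PR: the step still targets the guided prediction, but the noise direction is taken from the unconditional prediction rather than the guided one. Below is a rough sketch of that twist, written as a plain Euler-ancestral step for brevity (not DPM++ 2S, and not the code from #3871); variable names are assumptions.

```python
import torch

def cfg(cond_denoised, uncond_denoised, scale):
    # Standard classifier-free guidance mix of the two denoised predictions.
    return uncond_denoised + scale * (cond_denoised - uncond_denoised)

def euler_ancestral_cfgpp_step(x, sigma, sigma_down, sigma_up,
                               cond_denoised, uncond_denoised, scale):
    # One simplified step showing the CFG++-style variant: the step target is the
    # guided prediction, but the direction d comes from the unconditional one.
    denoised = cfg(cond_denoised, uncond_denoised, scale)
    d = (x - uncond_denoised) / sigma           # direction from the *unconditional* estimate
    x = denoised + d * sigma_down               # move toward the guided target
    if sigma_up > 0:
        x = x + torch.randn_like(x) * sigma_up  # ancestral noise injection
    return x
```
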
patientx
c4e18b7206
Merge branch 'comfyanonymous:master' into master 2024-09-08 22:20:50 +03:00
comfyanonymous
9c5fca75f4 Fix lora issue. 2024-09-08 10:10:47 -04:00
comfyanonymous
32a60a7bac Support onetrainer text encoder Flux lora. 2024-09-08 09:31:41 -04:00
patientx
52f858d715
Merge branch 'comfyanonymous:master' into master 2024-09-07 14:47:35 +03:00
Jim Winkens
bb52934ba4
Fix import issue (#4815) 2024-09-07 05:28:32 -04:00
patientx
962638c9dc
Merge branch 'comfyanonymous:master' into master 2024-09-07 11:04:57 +03:00
comfyanonymous
ea77750759 Support a generic Comfy format for text encoder loras.
This is a format with keys like:
text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight

Instead of waiting for me to add support for specific lora formats you can
convert your text encoder loras to this format instead.

If you want to see an example, save a text encoder lora with the SaveLora
node using the commit right after this one.
2024-09-07 02:20:39 -04:00
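
The commit message describes the target key layout but not a converter, so here is a minimal key-renaming sketch. Only the target layout ("text_encoders.clip_l.transformer....lora_up.weight") comes from the message above; the "te." source prefix is a made-up placeholder, and real loras (kohya, diffusers, ...) need their own, more careful mapping.

```python
def to_comfy_te_lora(state_dict, src_prefix="te.", target_te="clip_l"):
    # Re-prefix text-encoder lora keys into the generic Comfy layout described above.
    # 'te.' is a hypothetical source prefix used only for illustration.
    out = {}
    for key, value in state_dict.items():
        is_lora_key = ".lora_up." in key or ".lora_down." in key or key.endswith(".alpha")
        if key.startswith(src_prefix) and is_lora_key:
            # e.g. "te.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight"
            #   -> "text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight"
            out[f"text_encoders.{target_te}.transformer.{key[len(src_prefix):]}"] = value
        else:
            out[key] = value
    return out
```
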
patientx
bc054d012b
Merge branch 'comfyanonymous:master' into master 2024-09-06 10:58:13 +03:00
comfyanonymous
c27ebeb1c2 Fix onnx export not working on flux. 2024-09-06 03:21:52 -04:00
patientx
6fdbaf1a76
Merge branch 'comfyanonymous:master' into master 2024-09-05 12:04:05 +03:00
comfyanonymous
5cbaa9e07c Mistoline flux controlnet support. 2024-09-05 00:05:17 -04:00
comfyanonymous
c7427375ee Prioritize freeing partially offloaded models first. 2024-09-04 19:47:32 -04:00
patientx
894c727ce2
Update model_management.py 2024-09-05 00:05:54 +03:00
patientx
b518390241
Merge branch 'comfyanonymous:master' into master 2024-09-04 22:36:12 +03:00
Jedrzej Kosinski
f04229b84d
Add emb_patch support to UNetModel forward (#4779) 2024-09-04 14:35:15 -04:00
patientx
64f428801e
Merge branch 'comfyanonymous:master' into master 2024-09-04 09:29:56 +03:00
Silver
f067ad15d1
Make live preview size a configurable launch argument (#4649)
* Make live preview size a configurable launch argument

* Remove import from testing phase

* Update cli_args.py
2024-09-03 19:16:38 -04:00
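
As a rough illustration of what a configurable preview size means on the server side, the sketch below caps a preview image to a maximum edge length before it would be streamed to the client. The --preview-size flag name and the 512-pixel default are assumptions based on the PR title, and PIL is used here purely for brevity.

```python
# Sketch: clamp live-preview images to a configurable maximum edge length.
# Flag name and default are assumptions; the actual handling may differ.
import argparse
from PIL import Image

parser = argparse.ArgumentParser()
parser.add_argument("--preview-size", type=int, default=512,
                    help="Maximum edge length (in pixels) of live preview images.")
args, _ = parser.parse_known_args()

def scale_preview(img: Image.Image, max_size: int = args.preview_size) -> Image.Image:
    # thumbnail() preserves aspect ratio and never upscales; copy first since it works in place.
    img = img.copy()
    img.thumbnail((max_size, max_size))
    return img
```
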
comfyanonymous
483004dd1d Support newer glora format. 2024-09-03 17:02:19 -04:00
patientx
88ccc8f3a5
Merge branch 'comfyanonymous:master' into master 2024-09-03 11:01:28 +03:00
comfyanonymous
00a5d08103 Lower fp8 lora memory usage. 2024-09-03 01:25:05 -04:00
patientx
f2122a355b
Merge branch 'comfyanonymous:master' into master 2024-09-02 16:06:23 +03:00
comfyanonymous
d043997d30 Flux onetrainer lora. 2024-09-02 08:22:15 -04:00
patientx
93fa5c9ebb
Merge branch 'comfyanonymous:master' into master 2024-09-02 10:03:48 +03:00
comfyanonymous
8d31a6632f Speed up inference on nvidia 10 series on Linux. 2024-09-01 17:29:31 -04:00
patientx
f02c0d3ed9
Merge branch 'comfyanonymous:master' into master 2024-09-01 14:34:56 +03:00
comfyanonymous
b643eae08b Make minimum_inference_memory() depend on --reserve-vram 2024-09-01 01:18:34 -04:00
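
A minimal sketch of what tying minimum_inference_memory() to --reserve-vram could look like is below; the 1 GiB base figure, the GB unit of the flag, and the variable names are assumptions for illustration, not the values in the actual commit.

```python
# Sketch: let the inference-memory floor grow with the user's --reserve-vram setting.
import argparse

GIB = 1024 ** 3

parser = argparse.ArgumentParser()
parser.add_argument("--reserve-vram", type=float, default=0.0,
                    help="Amount of VRAM (in GB) to keep free for the OS / other software.")
args, _ = parser.parse_known_args()

EXTRA_RESERVED_VRAM = int(args.reserve_vram * GIB)

def minimum_inference_memory() -> int:
    # Bytes that must stay free before a model is allowed to load and run.
    return GIB + EXTRA_RESERVED_VRAM
```
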
patientx
acc3d6a2ea
Update model_management.py 2024-08-30 20:13:28 +03:00
patientx
51af2440ef
Update model_management.py 2024-08-30 20:10:47 +03:00
patientx
3e226f02f3
Update model_management.py 2024-08-30 20:08:18 +03:00
comfyanonymous
935ae153e1 Cleanup. 2024-08-30 12:53:59 -04:00
patientx
aeab6d1370
Merge branch 'comfyanonymous:master' into master 2024-08-30 19:49:03 +03:00
Chenlei Hu
e91662e784
Get logs endpoint & system_stats additions (#4690)
* Add route for getting output logs

* Include ComfyUI version

* Move to own function

* Changed to memory logger

* Unify logger setup logic

* Fix get version git fallback

---------

Co-authored-by: pythongosssss <125205205+pythongosssss@users.noreply.github.com>
2024-08-30 12:46:37 -04:00
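
"Changed to memory logger" in the PR notes refers to keeping recent log lines in memory so an HTTP route can return them. A small deque-backed handler like the one sketched below is one common way to do that; the class name, capacity, and format are illustrative, not the ones added in #4690.

```python
# Sketch: keep the last N log records in memory so a /logs-style endpoint can serve them.
import logging
from collections import deque

class MemoryLogHandler(logging.Handler):
    def __init__(self, capacity: int = 300):
        super().__init__()
        self.records = deque(maxlen=capacity)   # old entries fall off automatically

    def emit(self, record: logging.LogRecord) -> None:
        self.records.append(self.format(record))

    def get_logs(self) -> str:
        return "\n".join(self.records)

memory_handler = MemoryLogHandler()
memory_handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logging.getLogger().addHandler(memory_handler)
```
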
patientx
d8c04b9022
Merge branch 'comfyanonymous:master' into master 2024-08-30 19:42:36 +03:00
patientx
524cd140b5
removed bfloat from flux model support, resulting in 2x speedup 2024-08-30 13:33:32 +03:00
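
This fork-side change presumably drops torch.bfloat16 from the dtypes the Flux model advertises, so dtype selection falls back to fp16 on hardware where bf16 is slow. A minimal sketch of that kind of filter follows; the function name and list handling are assumptions, not the actual diff.

```python
# Sketch: filter bfloat16 out of the dtypes a model reports as supported,
# so dtype selection falls back to float16. Names are assumptions.
import torch

def supported_inference_dtypes_without_bf16(supported_dtypes):
    filtered = [d for d in supported_dtypes if d != torch.bfloat16]
    # Keep at least one entry so selection never ends up with an empty list.
    return filtered or [torch.float16]

# Example: [torch.bfloat16, torch.float16, torch.float32] -> [torch.float16, torch.float32]
```
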
patientx
a8652a052f
Merge branch 'comfyanonymous:master' into master 2024-08-30 12:14:01 +03:00
comfyanonymous
63fafaef45 Fix potential issue with hydit controlnets. 2024-08-30 04:58:41 -04:00
comfyanonymous
6eb5d64522 Fix glora lowvram issue. 2024-08-29 19:07:23 -04:00