Commit Graph

1160 Commits

Author SHA1 Message Date
patientx
d3c8252c48 Merge branch 'comfyanonymous:master' into master 2024-09-20 10:14:16 +03:00
comfyanonymous
70a708d726 Fix model merging issue. 2024-09-20 02:31:44 -04:00
yoinked
e7d4782736 add laplace scheduler [2407.03297] (#4990)
* add laplace scheduler [2407.03297]
* should be here instead lol
* better settings
2024-09-19 23:23:09 -04:00
patientx
9e8686df8d Merge branch 'comfyanonymous:master' into master 2024-09-19 19:57:21 +03:00
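For context on the Laplace scheduler entry above ([2407.03297]): the idea is to place denoising steps by sampling noise levels through the inverse CDF of a Laplace distribution over log-sigma. A minimal sketch, where the sigma_min/sigma_max defaults, the mu/beta defaults, and the final rescaling step are illustrative assumptions rather than the repository's exact code:

```python
import math

def laplace_sigmas(n_steps, sigma_min=0.0292, sigma_max=14.61, mu=0.0, beta=0.5):
    """Sketch of a Laplace noise schedule: draw log-sigmas via the inverse
    CDF of a Laplace(mu, beta) distribution, then rescale them to the
    model's [log(sigma_min), log(sigma_max)] range, highest noise first."""
    eps = 1e-5  # keep the inverse CDF away from log(0)
    xs = []
    for i in range(n_steps):
        t = 1.0 - i / (n_steps - 1)          # t runs 1 -> 0 (high to low noise)
        p = min(max(t, eps), 1.0 - eps)
        # inverse Laplace CDF: mu - b * sign(p - 1/2) * ln(1 - 2|p - 1/2|)
        x = mu - beta * math.copysign(1.0, p - 0.5) * math.log(1.0 - 2.0 * abs(p - 0.5))
        xs.append(x)
    lo, hi = min(xs), max(xs)
    log_min, log_max = math.log(sigma_min), math.log(sigma_max)
    # map the Laplace samples onto the model's log-sigma range
    return [math.exp(log_min + (x - lo) / (hi - lo) * (log_max - log_min)) for x in xs]
```

Because the Laplace density concentrates mass around mu, this spends more steps at mid-range noise levels than a uniform log-sigma spacing would.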
comfyanonymous
ad66f7c7d8 Add model_options to load_controlnet function. 2024-09-19 08:23:35 -04:00
Simon Lui
de8e8e3b0d Fix xpu Pytorch nightly build from calling optimize which doesn't exist. (#4978) 2024-09-19 05:11:42 -04:00
patientx
4c62b6d8f0 Merge branch 'comfyanonymous:master' into master 2024-09-17 11:02:10 +03:00
pharmapsychotic
0b7dfa986d Improve tiling calculations to reduce number of tiles that need to be processed. (#4944) 2024-09-17 03:51:10 -04:00
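The tiling commit above (#4944) is about how many overlapping tiles a pass must process to cover an image. The core count is a ceiling division over the stride; this is a sketch under assumed tile/overlap semantics, not the PR's code:

```python
import math

def tile_count(size, tile, overlap):
    """Number of tiles of width `tile` overlapping by `overlap` pixels
    needed to cover `size` pixels (illustrative of the kind of
    calculation the tiling commit optimizes)."""
    if size <= tile:
        return 1  # one tile already covers everything
    stride = tile - overlap
    # each extra tile advances by `stride`; the first covers `tile` pixels
    return math.ceil((size - overlap) / stride)
```

Getting this count tight matters because every extra tile is a full model pass over that region.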
comfyanonymous
d514bb38ee Add some option to model_options for the text encoder.
load_device, offload_device and the initial_device can now be set.
2024-09-17 03:49:54 -04:00
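The commit above names three text encoder placement options: load_device, offload_device, and initial_device. The key names come from the commit message; the dict shape and defaults below are assumptions for illustration, not ComfyUI's exact API:

```python
# Hypothetical sketch of text-encoder device options; device strings and
# the fallback helper are illustrative, not ComfyUI internals.
model_options = {
    "load_device": "cuda:0",    # device the weights run on during encoding
    "offload_device": "cpu",    # device the weights are parked on when idle
    "initial_device": "cpu",    # device the weights are first materialized on
}

def resolve_device(options, key, default="cpu"):
    """Pick a device from the options dict, falling back to a default."""
    return options.get(key, default)
```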
comfyanonymous
0849c80e2a get_key_patches now works without unloading the model. 2024-09-17 01:57:59 -04:00
patientx
8bcf408c79 Merge branch 'comfyanonymous:master' into master 2024-09-15 16:41:23 +03:00
comfyanonymous
e813abbb2c Long CLIP L support for SDXL, SD3 and Flux.
Use the *CLIPLoader nodes.
2024-09-15 07:59:38 -04:00
patientx
b9a24c0146 Merge branch 'comfyanonymous:master' into master 2024-09-14 16:14:57 +03:00
comfyanonymous
f48e390032 Support AliMama SD3 and Flux inpaint controlnets.
Use the ControlNetInpaintingAliMamaApply node.
2024-09-14 09:05:16 -04:00
patientx
2710ea1aa2 Merge branch 'comfyanonymous:master' into master 2024-09-13 23:07:04 +03:00
comfyanonymous
cf80d28689 Support loading controlnets with different input. 2024-09-13 09:54:37 -04:00
patientx
8ca077b268 Merge branch 'comfyanonymous:master' into master 2024-09-12 16:48:54 +03:00
Robin Huang
b962db9952 Add cli arg to override user directory (#4856)
* Override user directory.
* Use overridden user directory.
* Remove prints.
* Remove references to global user_files.
* Remove unused replace_folder function.
* Remove newline.
* Remove global during get_user_directory.
* Add validation.
2024-09-12 08:10:27 -04:00
patientx
73c31987c6 Merge branch 'comfyanonymous:master' into master 2024-09-12 12:44:27 +03:00
comfyanonymous
9d720187f1 types -> comfy_types to fix import issue. 2024-09-12 03:57:46 -04:00
patientx
8eb7ca051a Merge branch 'comfyanonymous:master' into master 2024-09-11 10:26:46 +03:00
comfyanonymous
9f4daca9d9 Doesn't really make sense for cfg_pp sampler to call regular one. 2024-09-11 02:51:36 -04:00
yoinked
b5d0f2a908 Add CFG++ to DPM++ 2S Ancestral (#3871)
* Update sampling.py
* Update samplers.py
* my bad
* "fix" the sampler
* Update samplers.py
* i named it wrong
* minor sampling improvements
  mainly using a dynamic rho value (hey this sounds a lot like smea!!!)
* revert rho change
  rho? r? its just 1/2
2024-09-11 02:49:44 -04:00
patientx
c4e18b7206 Merge branch 'comfyanonymous:master' into master 2024-09-08 22:20:50 +03:00
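For context on the CFG++ entry above (#3871): standard classifier-free guidance mixes the unconditional and conditional predictions, and CFG++ additionally reuses the unconditional prediction when renoising toward the next step. A scalar toy sketch, with floats standing in for tensors and every name illustrative; the real samplers differ in detail:

```python
def cfg(uncond, cond, w):
    """Standard classifier-free guidance: move w times past uncond toward cond."""
    return uncond + w * (cond - uncond)

def cfgpp_step(x, uncond, cond, w, sigma, sigma_next):
    """One toy Euler-style step in the CFG++ spirit (scalar sketch):
    denoise with the guided prediction, renoise with the unconditional one."""
    denoised = x - sigma * cfg(uncond, cond, w)  # guided denoised estimate
    return denoised + sigma_next * uncond        # renoise using the uncond prediction
```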
comfyanonymous
9c5fca75f4 Fix lora issue. 2024-09-08 10:10:47 -04:00
comfyanonymous
32a60a7bac Support onetrainer text encoder Flux lora. 2024-09-08 09:31:41 -04:00
patientx
52f858d715 Merge branch 'comfyanonymous:master' into master 2024-09-07 14:47:35 +03:00
Jim Winkens
bb52934ba4 Fix import issue (#4815) 2024-09-07 05:28:32 -04:00
patientx
962638c9dc Merge branch 'comfyanonymous:master' into master 2024-09-07 11:04:57 +03:00
comfyanonymous
ea77750759 Support a generic Comfy format for text encoder loras.
This is a format with keys like:
text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight

Instead of waiting for me to add support for specific lora formats you can
convert your text encoder loras to this format instead.

If you want to see an example save a text encoder lora with the SaveLora
node with the commit right after this one.
2024-09-07 02:20:39 -04:00
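The commit above describes the generic text-encoder lora key layout, with the example key text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight. A small helper that assembles keys in that layout; the helper itself is illustrative and not part of ComfyUI:

```python
def comfy_te_lora_key(encoder, layer, proj, direction):
    """Build a key in the generic Comfy text-encoder lora format shown
    in the commit message (hypothetical helper for illustration)."""
    assert direction in ("lora_up", "lora_down")
    return (f"text_encoders.{encoder}.transformer.text_model.encoder."
            f"layers.{layer}.self_attn.{proj}.{direction}.weight")
```

As the commit notes, converting an unsupported text encoder lora into keys of this shape is an alternative to waiting for format-specific support.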
patientx
bc054d012b Merge branch 'comfyanonymous:master' into master 2024-09-06 10:58:13 +03:00
comfyanonymous
c27ebeb1c2 Fix onnx export not working on flux. 2024-09-06 03:21:52 -04:00
patientx
6fdbaf1a76 Merge branch 'comfyanonymous:master' into master 2024-09-05 12:04:05 +03:00
comfyanonymous
5cbaa9e07c Mistoline flux controlnet support. 2024-09-05 00:05:17 -04:00
comfyanonymous
c7427375ee Prioritize freeing partially offloaded models first. 2024-09-04 19:47:32 -04:00
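The freeing commit above states a policy: when memory is needed, unload partially offloaded models before fully resident ones. A sketch of that ordering, where the dict field name is an assumption and not ComfyUI's internal representation:

```python
def free_order(loaded_models):
    """Order unload candidates so partially offloaded models come first
    (sketch of the policy named in the commit; field names are assumed)."""
    # False sorts before True, so partially offloaded entries lead
    return sorted(loaded_models, key=lambda m: not m["partially_offloaded"])
```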
patientx
894c727ce2 Update model_management.py 2024-09-05 00:05:54 +03:00
patientx
b518390241 Merge branch 'comfyanonymous:master' into master 2024-09-04 22:36:12 +03:00
Jedrzej Kosinski
f04229b84d Add emb_patch support to UNetModel forward (#4779) 2024-09-04 14:35:15 -04:00
patientx
64f428801e Merge branch 'comfyanonymous:master' into master 2024-09-04 09:29:56 +03:00
Silver
f067ad15d1 Make live preview size a configurable launch argument (#4649)
* Make live preview size a configurable launch argument
* Remove import from testing phase
* Update cli_args.py
2024-09-03 19:16:38 -04:00
comfyanonymous
483004dd1d Support newer glora format. 2024-09-03 17:02:19 -04:00
patientx
88ccc8f3a5 Merge branch 'comfyanonymous:master' into master 2024-09-03 11:01:28 +03:00
comfyanonymous
00a5d08103 Lower fp8 lora memory usage. 2024-09-03 01:25:05 -04:00
patientx
f2122a355b Merge branch 'comfyanonymous:master' into master 2024-09-02 16:06:23 +03:00
comfyanonymous
d043997d30 Flux onetrainer lora. 2024-09-02 08:22:15 -04:00
patientx
93fa5c9ebb Merge branch 'comfyanonymous:master' into master 2024-09-02 10:03:48 +03:00
comfyanonymous
8d31a6632f Speed up inference on nvidia 10 series on Linux. 2024-09-01 17:29:31 -04:00
patientx
f02c0d3ed9 Merge branch 'comfyanonymous:master' into master 2024-09-01 14:34:56 +03:00
comfyanonymous
b643eae08b Make minimum_inference_memory() depend on --reserve-vram 2024-09-01 01:18:34 -04:00
patientx
acc3d6a2ea Update model_management.py 2024-08-30 20:13:28 +03:00