Commit Graph

3562 Commits

Author            SHA1        Message  Date

patientx          2a0bc66fed  Merge branch 'comfyanonymous:master' into master  2025-02-10 15:41:15 +03:00
comfyanonymous    4027466c80  Make lumina model work with any latent resolution.  2025-02-10 00:24:20 -05:00
patientx          9561b03cc7  Merge branch 'comfyanonymous:master' into master  2025-02-09 15:33:03 +03:00
comfyanonymous    095d867147  Remove useless function.  2025-02-09 07:02:57 -05:00
Pam               caeb27c3a5  res_multistep: Fix cfgpp and add ancestral samplers (#6731)  2025-02-08 19:39:58 -05:00
comfyanonymous    3d06e1c555  Make error more clear to user.  2025-02-08 18:57:24 -05:00
catboxanon        43a74c0de1  Allow FP16 accumulation with --fast (#6453)  2025-02-08 17:00:56 -05:00
    Currently only applies to PyTorch nightly releases. (>=20250208)
patientx          1d39eb81c8  Merge branch 'comfyanonymous:master' into master  2025-02-09 00:35:36 +03:00
comfyanonymous    af93c8d1ee  Document which text encoder to use for lumina 2.  2025-02-08 06:57:25 -05:00
patientx          d61d06201a  Merge branch 'comfyanonymous:master' into master  2025-02-08 02:17:20 +03:00
Raphael Walker    832e3f5ca3  Fix another small bug in attention_bias redux (#6737)  2025-02-07 14:44:43 -05:00
    * fix a bug in the attn_masked redux code when using weight=1.0
    * oh shit wait there was another bug
patientx          d0118796c9  Update install.bat  2025-02-07 12:49:41 +03:00
patientx          dc9f58ca65  Update patchzluda2.bat  2025-02-07 12:49:11 +03:00
patientx          1f5708663a  Update patchzluda.bat  2025-02-07 12:48:02 +03:00
patientx          9a9da027b2  Merge branch 'comfyanonymous:master' into master  2025-02-07 12:02:36 +03:00
comfyanonymous    079eccc92a  Don't compress http response by default.  2025-02-07 03:29:21 -05:00
    Remove argument to disable it.
    Add new --enable-compress-response-body argument to enable it.
patientx          1e9f6dfa8f  Merge branch 'comfyanonymous:master' into master  2025-02-07 01:22:02 +03:00
Raphael Walker    b6951768c4  fix a bug in the attn_masked redux code when using weight=1.0 (#6721)  2025-02-06 16:51:16 -05:00
patientx          1ef4776a96  Merge branch 'comfyanonymous:master' into master  2025-02-06 21:04:11 +03:00
Comfy Org PR Bot  fca304debf  Update frontend to v1.8.14 (#6724)  2025-02-06 10:43:10 -05:00
    Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>
patientx          4a1e3ee925  Merge branch 'comfyanonymous:master' into master  2025-02-06 14:33:29 +03:00
comfyanonymous    14880e6dba  Remove some useless code.  2025-02-06 05:00:37 -05:00
Chenlei Hu        f1059b0b82  Remove unused GET /files API endpoint (#6714)  2025-02-05 18:48:36 -05:00
comfyanonymous    debabccb84  Bump ComfyUI version to v0.3.14  2025-02-05 15:48:13 -05:00
patientx          93c0fc3446  Update supported_models.py  2025-02-05 23:12:38 +03:00
patientx          f8c2ab631a  Merge branch 'comfyanonymous:master' into master  2025-02-05 23:10:21 +03:00
comfyanonymous    37cd448529  Set the shift for Lumina back to 6.  2025-02-05 14:49:52 -05:00
comfyanonymous    94f21f9301  Upcasting rope to fp32 seems to make no difference in this model.  2025-02-05 04:32:47 -05:00
comfyanonymous    60653004e5  Use regular numbers for rope in lumina model.  2025-02-05 04:17:25 -05:00
patientx          059629a5fb  Merge branch 'comfyanonymous:master' into master  2025-02-05 10:15:14 +03:00
comfyanonymous    a57d635c5f  Fix lumina 2 batches.  2025-02-04 21:48:11 -05:00
patientx          523b5352b8  Merge branch 'comfyanonymous:master' into master  2025-02-04 18:16:21 +03:00
comfyanonymous    016b219dcc  Add Lumina Image 2.0 to Readme.  2025-02-04 08:08:36 -05:00
comfyanonymous    8ac2dddeed  Lower the default shift of lumina to reduce artifacts.  2025-02-04 06:50:37 -05:00
patientx          b8ab0f2091  Merge branch 'comfyanonymous:master' into master  2025-02-04 12:32:22 +03:00
comfyanonymous    3e880ac709  Fix on python 3.9  2025-02-04 04:20:56 -05:00
comfyanonymous    e5ea112a90  Support Lumina 2 model.  2025-02-04 04:16:30 -05:00
patientx          c525c6b8be  Merge branch 'comfyanonymous:master' into master  2025-02-04 02:24:38 +03:00
Raphael Walker    8d88bfaff9  allow searching for new .pt2 extension, which can contain AOTI compiled modules (#6689)  2025-02-03 17:07:35 -05:00
patientx          c4e112cf10  Merge branch 'comfyanonymous:master' into master  2025-02-03 13:54:53 +03:00
comfyanonymous    ed4d92b721  Model merging nodes for cosmos.  2025-02-03 03:31:39 -05:00
patientx          e5c7b07e6a  Merge branch 'comfyanonymous:master' into master  2025-02-03 02:49:41 +03:00
Comfy Org PR Bot  932ae8d9ca  Update frontend to v1.8.13 (#6682)  2025-02-02 17:54:44 -05:00
    Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>
patientx          bf081a208a  Merge branch 'comfyanonymous:master' into master  2025-02-02 18:27:25 +03:00
comfyanonymous    44e19a28d3  Use maximum negative value instead of -inf for masks in text encoders.  2025-02-02 09:46:00 -05:00
    This is probably more correct.
patientx          a5eb46e557  Merge branch 'comfyanonymous:master' into master  2025-02-02 17:30:16 +03:00
Dr.Lt.Data        0a0df5f136  better guide message for sageattention (#6634)  2025-02-02 09:26:47 -05:00
KarryCharon       24d6871e47  add disable-compres-response-body cli args; add compress middleware; (#6672)  2025-02-02 09:24:55 -05:00
patientx          2ccb7dd301  Merge branch 'comfyanonymous:master' into master  2025-02-01 15:33:58 +03:00
comfyanonymous    9e1d301129  Only use stable cascade lora format with cascade model.  2025-02-01 06:35:22 -05:00