patientx
e4448cf48e
Merge branch 'comfyanonymous:master' into master
2025-02-12 15:50:23 +03:00
comfyanonymous
1d5d6586f3
Fix ruff.
2025-02-12 06:49:16 -05:00
zhoufan2956
35740259de
mix_ascend_bf16_infer_err (#6794)
2025-02-12 06:48:11 -05:00
comfyanonymous
ab888e1e0b
Add add_weight_wrapper function to model patcher.
Functions can now easily be added to wrap/modify model weights.
2025-02-12 05:55:35 -05:00
comfyanonymous
d9f0fcdb0c
Cleanup.
2025-02-11 17:17:03 -05:00
HishamC
b124256817
Fix for running via DirectML (#6542)
* Fix for running via DirectML
Fix the DirectML empty image generation issue with Flux1. Add a CPU fallback for the unsupported path. Verified the model works on AMD GPUs
* Fix formatting
* Update causal mask calculation
2025-02-11 17:11:32 -05:00
patientx
196fc385e1
Merge branch 'comfyanonymous:master' into master
2025-02-11 16:38:17 +03:00
comfyanonymous
af4b7c91be
Make --force-fp16 actually force the diffusion model to be fp16.
2025-02-11 08:33:09 -05:00
bananasss00
e57d2282d1
Fix incorrect Content-Type for WebP images (#6752)
2025-02-11 04:48:35 -05:00
patientx
2a0bc66fed
Merge branch 'comfyanonymous:master' into master
2025-02-10 15:41:15 +03:00
comfyanonymous
4027466c80
Make lumina model work with any latent resolution.
2025-02-10 00:24:20 -05:00
patientx
9561b03cc7
Merge branch 'comfyanonymous:master' into master
2025-02-09 15:33:03 +03:00
comfyanonymous
095d867147
Remove useless function.
2025-02-09 07:02:57 -05:00
Pam
caeb27c3a5
res_multistep: Fix cfgpp and add ancestral samplers (#6731)
2025-02-08 19:39:58 -05:00
comfyanonymous
3d06e1c555
Make error more clear to user.
2025-02-08 18:57:24 -05:00
catboxanon
43a74c0de1
Allow FP16 accumulation with --fast (#6453)
Currently only applies to PyTorch nightly releases (>= 20250208).
2025-02-08 17:00:56 -05:00
patientx
1d39eb81c8
Merge branch 'comfyanonymous:master' into master
2025-02-09 00:35:36 +03:00
comfyanonymous
af93c8d1ee
Document which text encoder to use for lumina 2.
2025-02-08 06:57:25 -05:00
patientx
d61d06201a
Merge branch 'comfyanonymous:master' into master
2025-02-08 02:17:20 +03:00
Raphael Walker
832e3f5ca3
Fix another small bug in attention_bias redux (#6737)
* Fix a bug in the attn_masked redux code when using weight=1.0
* Fix another bug found after the first
2025-02-07 14:44:43 -05:00
patientx
d0118796c9
Update install.bat
2025-02-07 12:49:41 +03:00
patientx
dc9f58ca65
Update patchzluda2.bat
2025-02-07 12:49:11 +03:00
patientx
1f5708663a
Update patchzluda.bat
2025-02-07 12:48:02 +03:00
patientx
9a9da027b2
Merge branch 'comfyanonymous:master' into master
2025-02-07 12:02:36 +03:00
comfyanonymous
079eccc92a
Don't compress http response by default.
Remove the argument that disabled it.
Add a new --enable-compress-response-body argument to enable it.
2025-02-07 03:29:21 -05:00
patientx
1e9f6dfa8f
Merge branch 'comfyanonymous:master' into master
2025-02-07 01:22:02 +03:00
Raphael Walker
b6951768c4
Fix a bug in the attn_masked redux code when using weight=1.0 (#6721)
2025-02-06 16:51:16 -05:00
patientx
1ef4776a96
Merge branch 'comfyanonymous:master' into master
2025-02-06 21:04:11 +03:00
Comfy Org PR Bot
fca304debf
Update frontend to v1.8.14 (#6724)
Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>
2025-02-06 10:43:10 -05:00
patientx
4a1e3ee925
Merge branch 'comfyanonymous:master' into master
2025-02-06 14:33:29 +03:00
comfyanonymous
14880e6dba
Remove some useless code.
2025-02-06 05:00:37 -05:00
Chenlei Hu
f1059b0b82
Remove unused GET /files API endpoint (#6714)
2025-02-05 18:48:36 -05:00
comfyanonymous
debabccb84
Bump ComfyUI version to v0.3.14
2025-02-05 15:48:13 -05:00
patientx
93c0fc3446
Update supported_models.py
2025-02-05 23:12:38 +03:00
patientx
f8c2ab631a
Merge branch 'comfyanonymous:master' into master
2025-02-05 23:10:21 +03:00
comfyanonymous
37cd448529
Set the shift for Lumina back to 6.
2025-02-05 14:49:52 -05:00
comfyanonymous
94f21f9301
Upcasting rope to fp32 seems to make no difference in this model.
2025-02-05 04:32:47 -05:00
comfyanonymous
60653004e5
Use regular numbers for rope in lumina model.
2025-02-05 04:17:25 -05:00
patientx
059629a5fb
Merge branch 'comfyanonymous:master' into master
2025-02-05 10:15:14 +03:00
comfyanonymous
a57d635c5f
Fix lumina 2 batches.
2025-02-04 21:48:11 -05:00
patientx
523b5352b8
Merge branch 'comfyanonymous:master' into master
2025-02-04 18:16:21 +03:00
comfyanonymous
016b219dcc
Add Lumina Image 2.0 to Readme.
2025-02-04 08:08:36 -05:00
comfyanonymous
8ac2dddeed
Lower the default shift of lumina to reduce artifacts.
2025-02-04 06:50:37 -05:00
patientx
b8ab0f2091
Merge branch 'comfyanonymous:master' into master
2025-02-04 12:32:22 +03:00
comfyanonymous
3e880ac709
Fix on Python 3.9
2025-02-04 04:20:56 -05:00
comfyanonymous
e5ea112a90
Support Lumina 2 model.
2025-02-04 04:16:30 -05:00
patientx
c525c6b8be
Merge branch 'comfyanonymous:master' into master
2025-02-04 02:24:38 +03:00
Raphael Walker
8d88bfaff9
Allow searching for the new .pt2 extension, which can contain AOTI compiled modules (#6689)
2025-02-03 17:07:35 -05:00
patientx
c4e112cf10
Merge branch 'comfyanonymous:master' into master
2025-02-03 13:54:53 +03:00
comfyanonymous
ed4d92b721
Model merging nodes for cosmos.
2025-02-03 03:31:39 -05:00