patientx
0143b19681
Merge branch 'comfyanonymous:master' into master
2025-04-20 15:05:36 +03:00
comfyanonymous
fd27494441
Use an empty t5 of size 128 for hidream; it seems to give closer results.
2025-04-19 19:49:40 -04:00
power88
f43e1d7f41
Hidream: Allow loading hidream text encoders in CLIPLoader and DualCLIPLoader (#7676)
* Hidream: Allow partial loading text encoders
* reformat code for ruff check.
2025-04-19 19:47:30 -04:00
Yoland Yan
4486b0d0ff
Update CODEOWNERS and add christian-byrne (#7663)
2025-04-19 17:23:31 -04:00
patientx
cd77ded0c7
Merge branch 'comfyanonymous:master' into master
2025-04-19 23:20:10 +03:00
comfyanonymous
636d4bfb89
Fix hard crash when the spiece tokenizer path is bad.
2025-04-19 15:55:43 -04:00
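As a rough illustration of the kind of guard this fix implies (the helper name and error handling here are hypothetical, not the actual ComfyUI change):

import os
import sentencepiece as spm

def load_spiece_tokenizer(tokenizer_path):
    # Check the path before handing it to the native loader, so a bad
    # spiece.model location surfaces as a normal Python exception instead
    # of a hard crash inside the sentencepiece C++ code.
    if not tokenizer_path or not os.path.isfile(tokenizer_path):
        raise FileNotFoundError(f"spiece tokenizer model not found: {tokenizer_path}")
    return spm.SentencePieceProcessor(model_file=tokenizer_path)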
Robin Huang
dc300a4569
Add wanfun template workflows. (#7678)
2025-04-19 15:21:46 -04:00
patientx
e13cbbeeb9
Merge branch 'comfyanonymous:master' into master
2025-04-19 00:54:07 +03:00
patientx
682a70bcf9
Update zluda.py
2025-04-19 00:53:59 +03:00
Chenlei Hu
f3b09b9f2d
[BugFix] Update frontend to 1.16.9 (#7655)
Backport https://github.com/Comfy-Org/ComfyUI_frontend/pull/3505
2025-04-18 15:12:42 -04:00
patientx
159f3460ba
Merge branch 'comfyanonymous:master' into master
2025-04-18 15:40:02 +03:00
comfyanonymous
7ecd5e9614
Increase freq_cutoff in FreSca node.
2025-04-18 03:16:16 -04:00
City
2383a39e3b
Replace CLIPType if-chain with getattr (#7589)
* Replace CLIPType if with getattr
* Forgot to remove breakpoint from testing
2025-04-18 02:53:36 -04:00
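The pattern named in this commit title is roughly the following; the CLIPType members shown are a trimmed, illustrative stand-in for comfy.sd.CLIPType, not the real enum:

from enum import Enum

class CLIPType(Enum):  # illustrative subset only
    STABLE_DIFFUSION = 1
    SD3 = 2
    FLUX = 3

# Before: one if/elif branch per supported type.
def clip_type_old(name):
    if name == "sd3":
        return CLIPType.SD3
    elif name == "flux":
        return CLIPType.FLUX
    return CLIPType.STABLE_DIFFUSION

# After: a single getattr lookup with a default, so adding an enum
# member needs no extra branch.
def clip_type_new(name):
    return getattr(CLIPType, name.upper(), CLIPType.STABLE_DIFFUSION)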
Terry Jia
34e06bf7ec
Add support for outputting camera state (#7582)
2025-04-18 02:52:18 -04:00
Chenlei Hu
55822faa05
[Type] Annotate graph.get_input_info (#7386)
* [Type] Annotate graph.get_input_info
* nit
* nit
2025-04-17 21:02:24 -04:00
patientx
5f477c98f7
Update README.md
2025-04-18 00:34:54 +03:00
patientx
0d1aecfb43
Merge branch 'comfyanonymous:master' into master
2025-04-18 00:34:40 +03:00
patientx
cb8c52cbb2
Update README.md
2025-04-18 00:34:33 +03:00
comfyanonymous
880c205df1
Add hidream to readme.
2025-04-17 16:58:27 -04:00
comfyanonymous
3dc240d089
Make FreSca work on multiple dimensions.
2025-04-17 15:46:41 -04:00
BVH
19373aee75
Add FreSca node (#7631)
2025-04-17 15:24:33 -04:00
patientx
0bf09baf27
Merge branch 'comfyanonymous:master' into master
2025-04-17 22:15:18 +03:00
patientx
71ac9830ab
updated comfyui-frontend version
2025-04-17 22:15:11 +03:00
patientx
95773a0045
updated "comfyui-frontend version" - added "comfyui-workflow-templates"
2025-04-17 22:13:36 +03:00
comfyanonymous
93292bc450
ComfyUI version 0.3.29
2025-04-17 14:45:01 -04:00
Christian Byrne
05d5a75cdc
Update frontend to 1.16 (Install templates as pip package) (#7623)
* install templates as pip package
* Update requirements.txt
* bump templates version to include hidream
---------
Co-authored-by: Chenlei Hu <hcl@comfy.org>
2025-04-17 14:25:33 -04:00
patientx
d9211bbf1e
Merge branch 'comfyanonymous:master' into master
2025-04-17 20:51:12 +03:00
comfyanonymous
eba7a25e7a
Add WanFirstLastFrameToVideo node to use the new model.
2025-04-17 13:23:22 -04:00
comfyanonymous
dbcfd092a2
Set default context_img_len to 257
2025-04-17 12:42:34 -04:00
comfyanonymous
c14429940f
Support loading WAN FLF model.
2025-04-17 12:04:48 -04:00
patientx
6bbe3b19d4
Merge branch 'comfyanonymous:master' into master
2025-04-17 14:51:08 +03:00
comfyanonymous
0d720e4367
Don't hardcode length of context_img in wan code.
2025-04-17 06:25:39 -04:00
patientx
7950c510e7
Merge branch 'comfyanonymous:master' into master
2025-04-17 10:10:03 +03:00
comfyanonymous
1fc00ba4b6
Make hidream work with any latent resolution.
2025-04-16 18:34:14 -04:00
patientx
d05e9b8298
Merge branch 'comfyanonymous:master' into master
2025-04-17 01:22:11 +03:00
comfyanonymous
9899d187b1
Limit T5 to 128 tokens for HiDream: #7620
2025-04-16 18:07:55 -04:00
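A minimal sketch of what a 128-token cap looks like with a Hugging Face-style tokenizer; the function name and tokenizer argument are hypothetical, not the HiDream code path:

def encode_t5_prompt(tokenizer, prompt, max_tokens=128):
    # Truncate (and pad) the prompt to at most 128 T5 tokens before it
    # reaches the text encoder, as the commit above describes.
    return tokenizer(prompt,
                     truncation=True,
                     max_length=max_tokens,
                     padding="max_length",
                     return_tensors="pt")

# usage (given some loaded t5_tokenizer):
#   enc = encode_t5_prompt(t5_tokenizer, "a photo of a cat")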
comfyanonymous
f00f340a56
Reuse code from flux model.
2025-04-16 17:43:55 -04:00
patientx
503292b3c0
Merge branch 'comfyanonymous:master' into master
2025-04-16 23:26:10 +03:00
Chenlei Hu
cce1d9145e
[Type] Mark input options NotRequired (#7614)
2025-04-16 15:41:00 -04:00
patientx
7df5dbbee7
Update README.md
2025-04-16 17:08:08 +03:00
patientx
765224ad78
Merge branch 'comfyanonymous:master' into master
2025-04-16 12:10:01 +03:00
comfyanonymous
b4dc03ad76
Fix issue on old torch.
2025-04-16 04:53:56 -04:00
patientx
eee802e685
Update zluda.py
2025-04-16 00:53:18 +03:00
patientx
ad2fa1a675
Merge branch 'comfyanonymous:master' into master
2025-04-16 00:50:26 +03:00
comfyanonymous
9ad792f927
Basic support for hidream i1 model.
2025-04-15 17:35:05 -04:00
patientx
ab0860b96a
Merge branch 'comfyanonymous:master' into master
2025-04-15 21:37:22 +03:00
comfyanonymous
6fc5dbd52a
Cleanup.
2025-04-15 12:13:28 -04:00
patientx
af58afcae8
Merge branch 'comfyanonymous:master' into master
2025-04-15 18:23:31 +03:00
comfyanonymous
3e8155f7a3
More flexible long clip support.
Add clip g long clip support.
Text encoder refactor.
Support llama models with different vocab sizes.
2025-04-15 10:32:21 -04:00
patientx
09491fecbd
Merge branch 'comfyanonymous:master' into master
2025-04-15 14:41:15 +03:00