Commit Graph

4262 Commits

Author SHA1 Message Date
patientx
bbcb33ea72
Merge branch 'comfyanonymous:master' into master 2025-05-26 16:26:39 +03:00
comfyanonymous
e5799c4899
Enable pytorch attention by default on AMD gfx1151 (#8282) 2025-05-26 04:29:25 -04:00
patientx
48bbdd0842
Merge branch 'comfyanonymous:master' into master 2025-05-25 15:38:51 +03:00
comfyanonymous
a0651359d7
Return proper error if diffusion model not detected properly. (#8272) 2025-05-25 05:28:11 -04:00
patientx
3239031333
Merge branch 'comfyanonymous:master' into master 2025-05-25 00:52:26 +03:00
comfyanonymous
ad3bd8aa49 ComfyUI version 0.3.36 2025-05-24 17:30:37 -04:00
patientx
9790aaac7b
Merge branch 'comfyanonymous:master' into master 2025-05-24 14:00:54 +03:00
comfyanonymous
5a87757ef9
Better error if sageattention is installed but a dependency is missing. (#8264) 2025-05-24 06:43:12 -04:00
patientx
a30b905a5a
Merge branch 'comfyanonymous:master' into master 2025-05-24 10:36:57 +03:00
Christian Byrne
464aece92b
update frontend package to v1.20.5 (#8260) 2025-05-23 21:53:49 -07:00
patientx
3b69a08c08
Merge branch 'comfyanonymous:master' into master 2025-05-24 04:06:28 +03:00
comfyanonymous
0b50d4c0db
Add argument to explicitly enable fp8 compute support. (#8257)
This can be used to test whether your current GPU/PyTorch version supports fp8 matrix multiplication in combination with --fast or the fp8_e4m3fn_fast dtype.
2025-05-23 17:43:50 -04:00
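A minimal sketch of the kind of capability check the new argument overrides, assuming the test boils down to the PyTorch build exposing fp8 dtypes plus a recent enough GPU. This is illustrative only, not ComfyUI's actual detection code, and the compute-capability cutoff below is an assumption:

```python
import torch

# Illustrative probe only -- not ComfyUI's detection logic. The new argument
# exists precisely because heuristics like this can be wrong; the sm_89/sm_90
# (Ada/Hopper) threshold below is an assumption for the sketch.
def fp8_compute_probably_supported(device_index=0):
    if not hasattr(torch, "float8_e4m3fn"):
        return False                       # PyTorch build has no fp8 dtypes
    if not torch.cuda.is_available():
        return False
    major, minor = torch.cuda.get_device_capability(device_index)
    return (major, minor) >= (8, 9)        # assumed hardware cutoff for fp8 matmul

if __name__ == "__main__":
    print("fp8 compute likely supported:", fp8_compute_probably_supported())
```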
patientx
3e49c3e2ff
Merge branch 'comfyanonymous:master' into master 2025-05-24 00:01:56 +03:00
drhead
30b2eb8a93
create arange on-device (#8255) 2025-05-23 16:15:06 -04:00
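The change described here amounts to building index tensors directly on the target device instead of allocating them on the CPU and copying. A tiny illustration (the function name is made up for the example):

```python
import torch

def position_indices(x: torch.Tensor) -> torch.Tensor:
    # Before (the pattern being removed): torch.arange(x.shape[1]).to(x.device)
    # allocates on the CPU and then pays a host-to-device copy every call.
    # Creating the range on-device avoids that round trip:
    return torch.arange(x.shape[1], device=x.device, dtype=torch.long)
```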
patientx
25ff26df4a
Merge pull request #151 from LuXuxue/master
Fix file copy when a different virtual_env name is needed for testing
2025-05-23 18:15:50 +03:00
LuXuxue
87345d5892
Fix file copy when a different virtual_env name is needed for testing 2025-05-23 23:07:06 +08:00
LuXuxue
b69ba487b3
Fix file copy when a different virtual_env name is needed for testing 2025-05-23 23:04:50 +08:00
patientx
c653935b37
Merge branch 'comfyanonymous:master' into master 2025-05-23 13:57:11 +03:00
patientx
bc8ed4e728
Merge pull request #150 from LuXuxue/master
add some architectures to utils.py
2025-05-23 13:56:59 +03:00
LuXuxue
dc4958db54
add some architectures to utils.py 2025-05-23 13:54:03 +08:00
comfyanonymous
f85c08df06
Make VACE conditionings stackable. (#8240) 2025-05-22 19:22:26 -04:00
patientx
bf94f31441
Merge branch 'comfyanonymous:master' into master 2025-05-22 15:27:44 +03:00
comfyanonymous
4202e956a0
Add append feature to conditioning_set_values (#8239)
Refactor the unCLIPConditioning node.
2025-05-22 08:11:13 -04:00
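A rough sketch of the append behaviour the message describes, with names mirroring node_helpers.conditioning_set_values; the exact signature and merge rule are assumptions, not the verbatim ComfyUI code:

```python
def conditioning_set_values(conditioning, values=None, append=False):
    """Sketch of the helper the commit extends; hedged, not ComfyUI's source."""
    values = values or {}
    out = []
    for cond_tensor, options in conditioning:
        options = options.copy()
        for key, val in values.items():
            if append and key in options:
                # Append mode: chain the new entry onto the existing one
                # (e.g. stacking several VACE conditionings) instead of overwriting.
                options[key] = options[key] + val
            else:
                options[key] = val
        out.append([cond_tensor, options])
    return out
```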
Terry Jia
b838c36720
remove mtl from 3d model file list (#8192) 2025-05-22 08:08:36 -04:00
Chenlei Hu
fc39184ea9
Update frontend to 1.20 (#8232) 2025-05-22 02:24:36 -04:00
patientx
c3b1d07f5d
Merge branch 'comfyanonymous:master' into master 2025-05-22 00:06:48 +03:00
ComfyUI Wiki
ded60c33a0
Update templates to 0.1.18 (#8224) 2025-05-21 11:40:08 -07:00
patientx
40e835c7fc
Update README.md 2025-05-21 14:40:50 +03:00
patientx
9494f23ca5
Merge branch 'comfyanonymous:master' into master 2025-05-21 14:40:30 +03:00
patientx
220e304c13
Update README.md 2025-05-21 14:40:24 +03:00
Michael Abrahams
8bb858e4d3
Improve performance with a large number of queued prompts (#8176)
* get_current_queue_volatile

* restore get_current_queue method

* remove extra import
2025-05-21 05:14:17 -04:00
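A hedged illustration of the idea behind get_current_queue_volatile (not the real PromptQueue code): return shallow snapshots for read-only consumers instead of deep-copying every queued prompt while holding the lock:

```python
import copy
import threading

class PromptQueue:
    """Simplified stand-in for ComfyUI's prompt queue, for illustration only."""

    def __init__(self):
        self.mutex = threading.RLock()
        self.queue = []               # pending prompts
        self.currently_running = {}   # id -> running prompt

    def get_current_queue(self):
        # Original accessor: deep-copies everything under the lock, which
        # gets expensive when thousands of prompts are queued.
        with self.mutex:
            return copy.deepcopy(self.currently_running), copy.deepcopy(self.queue)

    def get_current_queue_volatile(self):
        # Assumed fast path for read-only callers (e.g. a status endpoint):
        # shallow container copies, items shared by reference, so callers
        # must treat the result as a volatile snapshot.
        with self.mutex:
            return dict(self.currently_running), list(self.queue)
```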
编程界的小学生
57893c843f
Code optimization and issue fixes in the ComfyUI server (#8196)
* Update server.py

* Update server.py
2025-05-21 04:59:42 -04:00
Jedrzej Kosinski
65da29aaa9
Make torch.compile LoRA/key-compatible (#8213)
* Make the torch compile node use a wrapper instead of object_patch for the entire diffusion_models object, so key associations on diffusion_models (loras, getting attributes, etc.) don't break

* Moved torch compile code into comfy_api so it can be used by custom nodes with a degree of confidence

* Refactor set_torch_compile_wrapper to support a list of keys instead of just diffusion_model, as well as additional torch.compile args

* remove unused import

* Moved torch compile kwargs to be stored in model_options instead of attachments; attachments are intended for things that should be 'persisted', i.e. not deepcopied

* Add some comments

* Remove random line of code, not sure how it got there
2025-05-21 04:56:56 -04:00
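A sketch of how a custom node might call the relocated helper; the import path, keyword names, and node boilerplate below are assumptions inferred from the bullet points above, not a verified API reference:

```python
# Assumed import path, based on "moved torch compile code into comfy_api".
from comfy_api.torch_helpers import set_torch_compile_wrapper


class TorchCompileMyModel:
    """Hypothetical custom node that compiles only selected sub-modules."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"model": ("MODEL",),
                             "backend": (["inductor", "cudagraphs"],)}}

    RETURN_TYPES = ("MODEL",)
    FUNCTION = "patch"
    CATEGORY = "model/optimization"

    def patch(self, model, backend):
        m = model.clone()
        # keys=... targets sub-modules by name instead of wrapping the whole
        # diffusion_models object, so LoRA key lookups and attribute access
        # keep working; extra kwargs are assumed to be forwarded to torch.compile.
        set_torch_compile_wrapper(m, backend=backend, keys=["diffusion_model"])
        return (m,)
```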
comfyanonymous
10024a38ea ComfyUI version v0.3.35 2025-05-21 04:50:37 -04:00
comfyanonymous
87f9130778
Revert "This doesn't seem to be needed on chroma. (#8209)" (#8210)
This reverts commit 7e84bf5373.
2025-05-20 05:39:55 -04:00
comfyanonymous
7e84bf5373
This doesn't seem to be needed on chroma. (#8209) 2025-05-20 05:29:23 -04:00
filtered
4f3b50ba51
Update README ROCm text to match link (#8199)
- Follow-up on #8198
2025-05-19 16:40:55 -04:00
comfyanonymous
e930a387d6
Update AMD instructions in README. (#8198) 2025-05-19 04:58:41 -04:00
patientx
220c937d6c
Update README.md 2025-05-19 02:28:47 +03:00
patientx
28be28a0cf
Update README.md 2025-05-18 11:37:35 +03:00
patientx
acf83b60f4
Merge branch 'comfyanonymous:master' into master 2025-05-18 11:37:17 +03:00
patientx
14f616b51b
Update README.md 2025-05-18 11:37:08 +03:00
comfyanonymous
d8e5662822
Remove default delimiter. (#8183) 2025-05-18 04:12:12 -04:00
LaVie024
3d44a09812
Update nodes_string.py (#8173) 2025-05-18 04:11:11 -04:00
comfyanonymous
62690eddec
Node to add pixel space noise to an image. (#8182) 2025-05-18 04:09:56 -04:00
Christian Byrne
05eb10b43a
Validate video inputs (#8133)
* validate kling lip sync input video

* add tooltips

* update duration estimates

* decrease epsilon

* fix rebase error
2025-05-18 04:08:47 -04:00
Silver
f5e4e976f4
Add missing category for T5TokenizerOption (#8177)
Change it if you need to, but it should at least have a category.
2025-05-18 02:59:06 -04:00
comfyanonymous
aee2908d03
Remove useless log. (#8166) 2025-05-17 06:27:34 -04:00
patientx
3818e62ee9
Update README.md 2025-05-16 23:33:17 +03:00
patientx
1c7e37c6b0
Update README.md 2025-05-16 23:29:38 +03:00