Comfy Org PR Bot
fca304debf
Update frontend to v1.8.14 (#6724)
...
Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>
2025-02-06 10:43:10 -05:00
comfyanonymous
14880e6dba
Remove some useless code.
2025-02-06 05:00:37 -05:00
Chenlei Hu
f1059b0b82
Remove unused GET /files API endpoint (#6714)
2025-02-05 18:48:36 -05:00
doctorpangloss
3f1f427ff4
Distinct Seed and Seed64 input specs. numpy only supports 32-bit seeds
2025-02-05 14:08:09 -08:00
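The seed commit above hinges on NumPy's 32-bit seed limit. A minimal sketch, assuming a hypothetical `clamp_seed_for_numpy` helper (not from the ComfyUI code) to fold a 64-bit seed into the range the legacy API accepts:

```python
import numpy as np

# NumPy's legacy RandomState rejects seeds outside [0, 2**32 - 1],
# hence distinct Seed (32-bit) and Seed64 input specs. This helper is a
# hypothetical illustration of folding a 64-bit seed into that range.
def clamp_seed_for_numpy(seed: int) -> int:
    return seed & 0xFFFFFFFF

rng = np.random.RandomState(clamp_seed_for_numpy(2**63 - 1))  # in range
try:
    np.random.RandomState(2**32)  # one past the legacy limit
    seed_rejected = False
except ValueError:
    seed_rejected = True
```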
doctorpangloss
6ab1aa1e8a
Improving MLLM/VLLM support and fixing bugs
...
- fix #29 str(model) no longer raises exceptions like with
HyVideoModelLoader
- don't try to format CUDA tensors because that can sometimes raise
exceptions
- cudaAllocAsync has been disabled for now due to 2.6.0 bugs
- improve florence2 support
- add support for paligemma 2. This requires the fix for transformers
that is currently staged in another repo, install with
`uv pip install --no-deps "transformers@git+https://github.com/zucchini-nlp/transformers.git#branch=paligemma-fix-kwargs"`
- triton has been updated
- fix missing __init__.py files
2025-02-05 14:02:28 -08:00
comfyanonymous
debabccb84
Bump ComfyUI version to v0.3.14
2025-02-05 15:48:13 -05:00
comfyanonymous
37cd448529
Set the shift for Lumina back to 6.
2025-02-05 14:49:52 -05:00
comfyanonymous
94f21f9301
Upcasting rope to fp32 seems to make no difference in this model.
2025-02-05 04:32:47 -05:00
comfyanonymous
60653004e5
Use regular numbers for rope in lumina model.
2025-02-05 04:17:25 -05:00
comfyanonymous
a57d635c5f
Fix lumina 2 batches.
2025-02-04 21:48:11 -05:00
doctorpangloss
dcac115f68
Revert "Update logging when models are loaded"
...
This reverts commit 0d15a091c2.
2025-02-04 15:18:00 -08:00
doctorpangloss
80db9a8e25
Florence2
2025-02-04 15:17:14 -08:00
doctorpangloss
ce3583ad42
relax numpy requirements
2025-02-04 09:40:22 -08:00
doctorpangloss
1a24ceef79
Updates for torch 2.6.0, prepare Anthropic nodes, accept multiple logging levels
2025-02-04 09:27:18 -08:00
Benjamin Berman
fac670da89
Merge pull request #28 from leszko/patch-2
...
Update logging when models are loaded
2025-02-04 08:28:30 -08:00
Rafał Leszko
0d15a091c2
Update logging when models are loaded
...
The "Loaded " log was emitted even when no models were actually loaded into VRAM
2025-02-04 14:44:12 +01:00
comfyanonymous
016b219dcc
Add Lumina Image 2.0 to Readme.
2025-02-04 08:08:36 -05:00
comfyanonymous
8ac2dddeed
Lower the default shift of lumina to reduce artifacts.
2025-02-04 06:50:37 -05:00
comfyanonymous
3e880ac709
Fix on python 3.9
2025-02-04 04:20:56 -05:00
comfyanonymous
e5ea112a90
Support Lumina 2 model.
2025-02-04 04:16:30 -05:00
Raphael Walker
8d88bfaff9
Allow searching for the new .pt2 extension, which can contain AOTI-compiled modules (#6689)
2025-02-03 17:07:35 -05:00
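A sketch of what recognizing a new model extension amounts to; the set and function names below are illustrative, not the actual `folder_paths` code. `.pt2` is the `torch.export`/AOTInductor compiled-module format:

```python
import os

# Hypothetical extension allow-list; ".pt2" files hold AOTInductor
# (torch.export) compiled modules alongside the usual checkpoint formats.
SUPPORTED_PT_EXTENSIONS = {".ckpt", ".pt", ".pt2", ".safetensors"}

def is_supported_model_file(filename: str) -> bool:
    return os.path.splitext(filename)[1].lower() in SUPPORTED_PT_EXTENSIONS
```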
comfyanonymous
ed4d92b721
Model merging nodes for cosmos.
2025-02-03 03:31:39 -05:00
Comfy Org PR Bot
932ae8d9ca
Update frontend to v1.8.13 (#6682)
...
Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>
2025-02-02 17:54:44 -05:00
comfyanonymous
44e19a28d3
Use maximum negative value instead of -inf for masks in text encoders.
...
This is probably more correct.
2025-02-02 09:46:00 -05:00
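A numpy sketch of why the dtype's most negative finite value is safer than `-inf` in an additive attention mask: a fully masked row stays finite instead of collapsing to NaN inside the softmax shift.

```python
import numpy as np

def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

# A row masked entirely with -inf goes NaN: -inf - (-inf) is NaN.
nan_row = softmax(np.full((1, 4), -np.inf, dtype=np.float32))

# The most negative finite float keeps the row finite (and uniform,
# since every position is suppressed equally).
finite_row = softmax(np.full((1, 4), np.finfo(np.float32).min, dtype=np.float32))
```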
Dr.Lt.Data
0a0df5f136
Better guide message for sageattention (#6634)
2025-02-02 09:26:47 -05:00
KarryCharon
24d6871e47
Add --disable-compres-response-body CLI arg; add compress middleware (#6672)
2025-02-02 09:24:55 -05:00
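The trade-off behind a response-compression middleware, in miniature: workflow/API JSON is highly repetitive and gzips well, at some CPU cost, which is why an opt-out flag makes sense. The payload here is synthetic:

```python
import gzip
import json

# Repetitive JSON, like a serialized workflow, compresses very well.
payload = json.dumps({"nodes": [{"class_type": "KSampler"}] * 200}).encode()
compressed = gzip.compress(payload)
ratio = len(compressed) / len(payload)
```

On slow links the smaller body wins; on CPU-bound servers the flag (spelled `--disable-compres-response-body` in the commit) skips the work.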
comfyanonymous
9e1d301129
Only use stable cascade lora format with cascade model.
2025-02-01 06:35:22 -05:00
doctorpangloss
1488f2c59b
Logger should check attributes on current sys.stdout, which may have been overwritten
2025-01-31 10:58:11 -08:00
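The stdout fix above boils down to looking attributes up on the *current* `sys.stdout` at call time rather than on a reference cached at import, since frontends and test harnesses routinely replace it. A minimal sketch; the function name is illustrative:

```python
import io
import sys

def stdout_supports_tty() -> bool:
    # Re-read sys.stdout on every call; getattr guards against
    # replacement objects that lack isatty entirely.
    isatty = getattr(sys.stdout, "isatty", None)
    return bool(isatty and isatty())

_original = sys.stdout
sys.stdout = io.StringIO()           # simulate an overwritten stdout
result = stdout_supports_tty()       # no AttributeError, just False
sys.stdout = _original
```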
Terry Jia
768e035868
Add node for preview 3d animation (#6594)
...
* Add node for preview 3d animation
* remove bg_color param
* remove animation_speed param
2025-01-31 10:09:07 -08:00
Comfy Org PR Bot
669e0497ea
Update frontend to v1.8.12 (#6662)
...
Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>
2025-01-31 10:07:37 -08:00
doctorpangloss
c1b173e62a
Update to cu124 for torch 2.6.0 support
2025-01-31 09:13:47 -08:00
comfyanonymous
541dc08547
Update Readme.
2025-01-31 08:35:48 -05:00
comfyanonymous
8d8dc9a262
Allow batch of different sigmas when noise scaling.
2025-01-30 06:49:52 -05:00
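Supporting a batch of different sigmas comes down to a broadcast: reshape the per-sample sigmas to `(batch, 1, 1, 1)` so each latent in the batch is scaled by its own value. A numpy sketch with illustrative shapes:

```python
import numpy as np

latent = np.ones((2, 4, 8, 8), dtype=np.float32)
noise = np.ones_like(latent)
sigmas = np.array([0.5, 2.0], dtype=np.float32)

# (2,) -> (2, 1, 1, 1) broadcasts one sigma per batch element.
noised = latent + noise * sigmas.reshape(-1, 1, 1, 1)
```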
comfyanonymous
2f98c24360
Update Readme with link to instruction for Nvidia 50 series.
2025-01-30 02:12:43 -05:00
comfyanonymous
ef85058e97
Bump ComfyUI version to v0.3.13
2025-01-29 16:07:12 -05:00
comfyanonymous
f9230bd357
Update the python version in some workflows.
2025-01-29 15:54:13 -05:00
comfyanonymous
537c27cbf3
Bump default cuda version in standalone package to 126.
2025-01-29 08:13:33 -05:00
comfyanonymous
6ff2e4d550
Remove logging call added in last commit.
...
This is called before the logging is set up so it messes up some things.
2025-01-29 08:08:01 -05:00
filtered
222f48c0f2
Allow changing folder_paths.base_path via command line argument. (#6600)
...
* Reimpl. CLI arg directly inside folder_paths.
* Update tests to use CLI arg mocking.
* Revert last-minute refactor.
* Fix test state polution.
2025-01-29 08:06:28 -05:00
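The shape of a base-path override flag, assuming a hypothetical `--base-path` spelling; the real flag name and the wiring inside `folder_paths` differ, this only sketches the argparse side:

```python
import argparse
from pathlib import Path

parser = argparse.ArgumentParser()
parser.add_argument("--base-path", type=Path, default=Path.cwd(),
                    help="Root directory used to resolve model folders.")

# Parse a synthetic argv; downstream paths derive from the override.
args = parser.parse_args(["--base-path", "/tmp/comfy"])
models_dir = args.base_path / "models"
```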
doctorpangloss
95a12f42e2
Fix pylint errors (they were real, as they usually are)
2025-01-28 17:16:15 -08:00
doctorpangloss
d24098cd9b
Fix mask uploads
2025-01-28 16:34:59 -08:00
doctorpangloss
044dff6887
Updates and fixes
...
- Update to latest triton
- Fix huggingface hub automatic downloads
- Latest transformers may require updating huggingface llava models
- Compiling flux with fp8 weights is not supported
2025-01-28 16:22:09 -08:00
doctorpangloss
a3452f6e6a
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2025-01-28 13:45:51 -08:00
comfyanonymous
13fd4d6e45
More friendly error messages for corrupted safetensors files.
2025-01-28 09:41:09 -05:00
Bradley Reynolds
1210d094c7
Convert latents_ubyte to 8-bit unsigned int before converting to CPU (#6300)
...
* Convert latents_ubyte to 8-bit unsigned int before converting to CPU
* Only convert to uint8 if directml_enabled
2025-01-28 08:22:54 -05:00
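Casting float latents to uint8 *before* the device-to-host copy moves a quarter of the bytes. A numpy stand-in for the torch version; the variable names mirror the commit subject, not the actual previewer code:

```python
import numpy as np

latents = np.random.rand(1, 96, 96, 3).astype(np.float32)

# Quantize on-"device", then transfer: 1 byte/element instead of 4.
latents_ubyte = (latents.clip(0.0, 1.0) * 255.0).round().astype(np.uint8)
```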
comfyanonymous
255edf2246
Lower minimum ratio of loaded weights on Nvidia.
2025-01-27 05:26:51 -05:00
comfyanonymous
4f011b9a00
Better CLIPTextEncode error when clip input is None.
2025-01-26 06:04:57 -05:00
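A hypothetical sketch of the friendlier check: fail fast with a message naming the missing input, instead of the cryptic AttributeError that surfaces when the encoder calls methods on None. The function name and wording are illustrative, not the actual node code:

```python
def encode_text(clip, text: str):
    if clip is None:
        raise RuntimeError(
            "clip input is None: the checkpoint you loaded may not "
            "contain a text encoder, or the CLIP loader failed."
        )
    return clip.tokenize(text)
```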
comfyanonymous
67feb05299
Remove redundant code.
2025-01-25 19:04:53 -05:00
comfyanonymous
6d21740346
Print ComfyUI version.
2025-01-25 15:03:57 -05:00
comfyanonymous
7fbf4b72fe
Update nightly pytorch ROCm command in Readme.
2025-01-24 06:15:54 -05:00