ComfyUI/comfy/ldm
Latest commit 6ba03a0747 by John Pollock:
Merge upstream/master into pyisolate-support with DVRAM guard, init-order fix, and whitelist
- model_base.py: resolve conflict, accepting WAN21_FlowRVS, WAN21_SCAIL, IMG_TO_IMG_FLOW, and guide_attention_entries from master
- main.py: initialize AIMDO control before the isolation proxy
- execution.py: guard vbars_reset_watermark_limits against a missing backend
- pyproject.toml: host whitelist (13 entries; MultiGPU excluded as a control)
- Baseline verified: 14/14 workflows pass non-isolated with Dynamic VRAM active

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:39:02 -06:00
ace Make ace step 1.5 base model work properly with default workflow. (#12337) 2026-02-06 19:14:56 -05:00
anima Fix anima LLM adapter forward when manual cast (#12504) 2026-02-17 07:56:44 -08:00
audio Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
aura Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
cascade Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
chroma Implement NAG on all the models based on the Flux code. (#12500) 2026-02-16 23:30:34 -05:00
chroma_radiance Use torch RMSNorm for flux models and refactor hunyuan video code. (#12432) 2026-02-13 15:35:13 -05:00
cosmos Some fixes to previous pr. (#12339) 2026-02-06 20:14:52 -05:00
flux Implement NAG on all the models based on the Flux code. (#12500) 2026-02-16 23:30:34 -05:00
genmo Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
hidream Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
hunyuan3d Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
hunyuan3dv2_1 Fix issue on old torch. (#9791) 2025-09-10 00:23:47 -04:00
hunyuan_video Implement NAG on all the models based on the Flux code. (#12500) 2026-02-16 23:30:34 -05:00
hydit
kandinsky5 Fix qwen scaled fp8 not working with kandinsky. Make basic t2i wf work. (#11162) 2025-12-06 17:50:10 -08:00
lightricks feat: per-guide attention strength control in self-attention (#12518) 2026-02-26 01:25:23 -05:00
lumina Only enable fp16 on z image models that actually support it. (#12065) 2026-01-24 22:32:28 -05:00
mmaudio/vae Implement the mmaudio VAE. (#10300) 2025-10-11 22:57:23 -04:00
models Flux 2 (#10879) 2025-11-25 10:50:19 -05:00
modules feat: Support SDPose-OOD (#12661) 2026-02-26 19:59:05 -05:00
omnigen Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
pixart
qwen_image Add working Qwen 2512 ControlNet (Fun ControlNet) support (#12359) 2026-02-13 22:23:52 -05:00
wan Merge upstream/master into pyisolate-support with DVRAM guard, init-order fix, and whitelist 2026-03-03 02:39:02 -06:00
common_dit.py
util.py New Year ruff cleanup. (#11595) 2026-01-01 22:06:14 -05:00