ComfyUI/comfy/ldm
Jedrzej Kosinski d7f40442f9
Enable Runtime Selection of Attention Functions (#9639)
* Looking into a @wrap_attn decorator that looks for an 'optimized_attention_override' entry in transformer_options (a minimal sketch of this idea follows the commit log below)

* Created logging code for this branch so that it can be used to track down all the code paths where transformer_options would need to be added

* Fix memory usage issue with inspect

* Made WAN attention receive transformer_options; added a test node to wan to try out the attention override later

* Added **kwargs to all attention functions so transformer_options could potentially be passed through

* Make sure wrap_attn doesn't make itself recurse infinitely; attempt to load SageAttention and FlashAttention even when not enabled so they can be marked as available or not; create a registry of available attention functions (see the registry sketch after the commit log)

* Turn off attention logging for now; give AttentionOverrideTestNode a dropdown of the available attention functions (this is a test node only)

* Made flux work with optimized_attention_override (the per-model wiring pattern repeated in the items below is sketched after the commit log)

* Add logs to verify optimized_attention_override is passed all the way into attention function

* Make Qwen work with optimized_attention_override

* Made hidream work with optimized_attention_override

* Made wan patches_replace work with optimized_attention_override

* Made SD3 work with optimized_attention_override

* Made HunyuanVideo work with optimized_attention_override

* Made Mochi work with optimized_attention_override

* Made LTX work with optimized_attention_override

* Made StableAudio work with optimized_attention_override

* Made optimized_attention_override work with ACE Step

* Made Hunyuan3D work with optimized_attention_override

* Made CosmosPredict2 work with optimized_attention_override

* Made CosmosVideo work with optimized_attention_override

* Made Omnigen 2 work with optimized_attention_override

* Made StableCascade work with optimized_attention_override

* Made AuraFlow work with optimized_attention_override

* Made Lumina work with optimized_attention_override

* Made Chroma work with optimized_attention_override

* Made SVD work with optimized_attention_override

* Fix WanI2VCrossAttention so that it expects to receive transformer_options

* Fixed Wan2.1 Fun Camera transformer_options passthrough

* Fixed WAN 2.1 VACE transformer_options passthrough

* Add 'optimized' to get_attention_function

* Disable attention logs for now

* Remove attention logging code

* Remove _register_core_attention_functions, since we wouldn't want someone calling it, just in case

* Satisfy ruff

* Remove AttentionOverrideTest node; that's something to cook up for later
2025-09-12 18:07:38 -04:00
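
As a rough illustration of the decorator idea mentioned in the commit log, the sketch below checks transformer_options for an 'optimized_attention_override' entry and delegates to it when present. The helper name, the override's call signature, and the recursion guard are assumptions for illustration, not the actual ComfyUI implementation.

```python
import functools

def wrap_attn(attn_func):
    # Hypothetical decorator: if transformer_options carries an
    # 'optimized_attention_override', call it instead of the wrapped function.
    @functools.wraps(attn_func)
    def wrapper(*args, transformer_options=None, **kwargs):
        if transformer_options is not None:
            override = transformer_options.get("optimized_attention_override")
            if override is not None and override is not wrapper:
                # Hand the original attn_func to the override so it can fall
                # back to it directly without re-entering this wrapper
                # (avoiding infinite recursion).
                return override(attn_func, *args,
                                transformer_options=transformer_options, **kwargs)
        return attn_func(*args, **kwargs)
    return wrapper
```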
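The availability registry and get_attention_function lookup could look roughly like the following. The variable and function names, and the use of importlib.util.find_spec to probe for SageAttention and FlashAttention, are illustrative assumptions rather than the code merged in #9639.

```python
import importlib.util

# Hypothetical registry of attention implementations selectable at runtime.
REGISTERED_ATTENTION_FUNCTIONS = {}

def register_attention_function(name, func):
    REGISTERED_ATTENTION_FUNCTIONS[name] = func

def get_attention_function(name, default=None):
    # "optimized" would be registered as an alias for whichever backend was
    # chosen at startup, so callers can request it by name like any other entry.
    return REGISTERED_ATTENTION_FUNCTIONS.get(name, default)

# Probe optional backends so they can be marked as available even when they
# were not enabled as the current implementation.
SAGE_ATTENTION_IS_AVAILABLE = importlib.util.find_spec("sageattention") is not None
FLASH_ATTENTION_IS_AVAILABLE = importlib.util.find_spec("flash_attn") is not None
```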
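The repeated "Made <model> work with optimized_attention_override" commits all come down to the same wiring: each model's attention block forwards transformer_options into its (wrapped) attention call. A generic, hypothetical example of that pattern, not taken from any specific model file:

```python
import torch.nn as nn

class CrossAttention(nn.Module):
    """Illustrative attention block; the real blocks differ per model."""

    def __init__(self, dim, heads, attention_fn):
        super().__init__()
        self.heads = heads
        self.attention_fn = attention_fn  # e.g. a wrap_attn-decorated function
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.to_out = nn.Linear(dim, dim)

    def forward(self, x, context=None, transformer_options=None):
        context = x if context is None else context
        q, k, v = self.to_q(x), self.to_k(context), self.to_v(context)
        # Forwarding transformer_options is the key step: it lets the wrapped
        # attention function see any runtime override the user selected.
        out = self.attention_fn(q, k, v, self.heads,
                                transformer_options=transformer_options)
        return self.to_out(out)
```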
ace Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
audio Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
aura Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
cascade Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
chroma Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
cosmos Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
flux Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
genmo Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
hidream Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
hunyuan3d Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
hunyuan3dv2_1 Fix issue on old torch. (#9791) 2025-09-10 00:23:47 -04:00
hunyuan_video Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
hydit Change cosmos and hydit models to use the native RMSNorm. (#7934) 2025-05-04 06:26:20 -04:00
lightricks Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
lumina Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
models Implement hunyuan image refiner model. (#9817) 2025-09-12 00:43:20 -04:00
modules Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
omnigen Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
pixart Remove windows line endings. (#8866) 2025-07-11 02:37:51 -04:00
qwen_image Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
wan Enable Runtime Selection of Attention Functions (#9639) 2025-09-12 18:07:38 -04:00
common_dit.py add RMSNorm to comfy.ops 2025-04-14 18:00:33 -04:00
util.py Fix and enforce new lines at the end of files. 2024-12-30 04:14:59 -05:00