ComfyUI/comfy/ldm
Raphael Walker 61b50720d0
Add support for attention masking in Flux (#5942)
* fix attention OOM in xformers

* allow passing attention mask in flux attention

* allow an attn_mask in flux

* attn masks can be done using replace patches instead of a separate dict

* fix return types

* fix return order

* enumerate

* patch the right keys

* arg names

* fix a silly bug

* fix xformers masks

* replace match with if, elif, else

* mask with image_ref_size

* remove unused import

* remove unused import 2

* fix pytorch/xformers attention

This corrects a weird inconsistency with skip_reshape.
It also allows masks of various shapes to be passed, which will be
automatically expanded (in a memory-efficient way) to a size that is
compatible with xformers or pytorch sdpa respectively.

* fix mask shapes
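The last two commits describe accepting masks of various shapes and expanding them, without copying data, to the 4-D shape that xformers or PyTorch SDPA expects. A minimal sketch of that idea, using NumPy's `broadcast_to` (a view-returning stand-in for `torch.Tensor.expand`); the helper name and rank handling are illustrative, not the actual ComfyUI code:

```python
import numpy as np

def expand_attn_mask(mask, batch, heads, lq, lk):
    """Broadcast an attention mask of rank 2, 3, or 4 to (batch, heads, lq, lk).

    Hypothetical helper: broadcast_to returns a read-only view, so the
    expansion costs no extra memory, mirroring the commit's description.
    """
    if mask.ndim == 2:        # (lq, lk): shared across batch and heads
        mask = mask[None, None]
    elif mask.ndim == 3:      # (batch, lq, lk): shared across heads
        mask = mask[:, None]
    elif mask.ndim != 4:
        raise ValueError(f"unsupported mask rank: {mask.ndim}")
    return np.broadcast_to(mask, (batch, heads, lq, lk))
```

In PyTorch the same pattern would use `mask.expand(batch, heads, lq, lk)` before passing it to `scaled_dot_product_attention`.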
2024-12-16 18:21:17 -05:00
audio | Lint and fix undefined names (1/N) (#6028) | 2024-12-12 18:55:26 -05:00
aura
cascade
flux | Add support for attention masking in Flux (#5942) | 2024-12-16 18:21:17 -05:00
genmo
hydit | Lint all unused variables (#5989) | 2024-12-12 17:59:16 -05:00
lightricks
models | Lint and fix undefined names (1/N) (#6028) | 2024-12-12 18:55:26 -05:00
modules
common_dit.py
util.py