mirror of https://github.com/comfyanonymous/ComfyUI.git
synced 2025-12-25 14:00:49 +08:00
* fix attention OOM in xformers
* allow passing attention mask in flux attention
* allow an attn_mask in flux
* attn masks can be done using replace patches instead of a separate dict
* fix return types
* fix return order
* enumerate
* patch the right keys
* arg names
* fix a silly bug
* fix xformers masks
* replace match with if, elif, else
* mask with image_ref_size
* remove unused import
* remove unused import 2
* fix pytorch/xformers attention

  This corrects a weird inconsistency with skip_reshape. It also allows masks of various shapes to be passed, which will be automatically expanded (in a memory-efficient way) to a size that is compatible with xformers or pytorch sdpa respectively.

* fix mask shapes
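The "memory-efficient" expansion mentioned above can be sketched as follows. This is a hypothetical helper, not the actual ComfyUI code: it assumes masks arrive as `(Lq, Lk)`, `(B, Lq, Lk)`, or `(B, H, Lq, Lk)` and broadcasts them to the 4-D shape PyTorch's `scaled_dot_product_attention` accepts. `Tensor.expand` returns a broadcast view, so no extra memory is allocated for the repeated batch/head dimensions.

```python
import torch
import torch.nn.functional as F

def expand_attn_mask(mask: torch.Tensor, batch: int, heads: int) -> torch.Tensor:
    # Hypothetical helper: normalize a 2-D/3-D/4-D mask to (B, H, Lq, Lk).
    if mask.ndim == 2:
        mask = mask.unsqueeze(0).unsqueeze(0)   # (Lq, Lk)    -> (1, 1, Lq, Lk)
    elif mask.ndim == 3:
        mask = mask.unsqueeze(1)                # (B, Lq, Lk) -> (B, 1, Lq, Lk)
    # expand() creates a view with stride 0 on broadcast dims: no copy is made.
    return mask.expand(batch, heads, mask.shape[-2], mask.shape[-1])

q = k = v = torch.randn(2, 4, 8, 16)            # (B, H, L, head_dim)
m = torch.zeros(8, 8)                           # one 2-D mask shared by all heads
out = F.scaled_dot_product_attention(q, k, v, attn_mask=expand_attn_mask(m, 2, 4))
print(out.shape)                                # torch.Size([2, 4, 8, 16])
```

A real implementation would also have to match xformers' expected layout, which differs from sdpa's; the sketch above covers only the sdpa side.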
| File |
|---|
| controlnet.py |
| layers.py |
| math.py |
| model.py |
| redux.py |