Mirror of https://github.com/comfyanonymous/ComfyUI.git, synced 2026-03-20 08:33:44 +08:00
In weight_decompose(), the 1D dora_scale tensor [N] divided by the multi-dimensional weight_norm [N, 1, ...] would incorrectly broadcast to [N, N, ...] (an outer-product shape) instead of element-wise [N, 1, ...]. This caused shape mismatches when applying DoRA to non-square weight matrices (e.g. MLP layers where d_ff != d_model), while silently producing correct results for square weights (most attention Q/K/V/O). Fix: explicitly reshape dora_scale to match weight_norm's dimensionality before the division.

Fixes #12938

Co-Authored-By: Claude (claude-opus-4-6) <noreply@anthropic.com>
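The broadcasting hazard the commit describes can be sketched as follows. This is an illustrative reconstruction, not ComfyUI's actual weight_decompose() code: the helper name and the norm computation are assumptions, but the shape arithmetic matches the commit message.

```python
import torch

def apply_dora_scale(dora_scale: torch.Tensor, weight: torch.Tensor) -> torch.Tensor:
    """Scale each output row of `weight` by dora_scale / per-row weight norm.

    Sketch of the DoRA rescaling step described in the commit; the exact
    upstream implementation may differ.
    """
    n = weight.shape[0]
    # Per-row norm, kept with trailing singleton dims: shape [N, 1, ...].
    weight_norm = weight.reshape(n, -1).norm(dim=1)
    weight_norm = weight_norm.reshape(n, *([1] * (weight.dim() - 1)))
    # Bug: a 1D dora_scale of shape [N] divided by weight_norm [N, 1, ...]
    # broadcasts as [1, N] / [N, 1] -> [N, N] (outer product). For square
    # weights the shapes happen to line up, silently hiding the bug.
    # Fix: reshape dora_scale to [N, 1, ...] so the division is element-wise.
    dora_scale = dora_scale.reshape(n, *([1] * (weight.dim() - 1)))
    return weight * (dora_scale / weight_norm)
```

With a non-square weight such as an MLP projection [4, 3], the unfixed division would yield a [4, 4] scale and fail at the final multiply, whereas the reshaped version produces a [4, 1] scale that broadcasts cleanly.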
Files:

- __init__.py
- base.py
- boft.py
- bypass.py
- glora.py
- loha.py
- lokr.py
- lora.py
- oft.py