Mirror of https://github.com/comfyanonymous/ComfyUI.git, synced 2026-03-30 21:43:43 +08:00
Fix QuantizedTensor weight restore failure causing progressive drift
When weight_inplace_update=True and a set_func (e.g. fp8_scaled ops) replaces the parameter with a new QuantizedTensor, the backup records inplace_update=True. On restore, unpatch_model uses copy_to_param, which calls param.data.copy_(). QuantizedTensor.__torch_dispatch__ routes copy_() through _dequant_and_fallback, which copies into a temporary float tensor without updating the underlying quantized data. The weight therefore silently remains LoRA-patched after "restore", and the next generation backs up the already-patched weight and applies LoRA again, causing progressive quality degradation (washout).

Fix: when set_func is present, force backup_inplace=False so restore uses set_attr_param (parameter replacement) instead of copy_to_param (in-place copy). set_attr_param correctly replaces the parameter object, which works for both regular tensors and QuantizedTensors.

Fixes #11021
This commit is contained in:
parent 3814bf4454
commit 19eb196941
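The failure mechanism described in the commit message can be illustrated with a torch-free sketch. The classes below (FakeQuantizedTensor, Module) are hypothetical stand-ins, not ComfyUI or PyTorch APIs: copy_() mimics the _dequant_and_fallback path by operating on a temporary dequantized buffer, so the quantized payload is never updated, while replacing the attribute (the set_attr_param analog) restores correctly.

```python
class FakeQuantizedTensor:
    """Hypothetical stand-in for QuantizedTensor: int payload plus a scale."""
    def __init__(self, qdata, scale):
        self.qdata = list(qdata)   # quantized payload
        self.scale = scale

    def dequantize(self):
        return [q * self.scale for q in self.qdata]

    def copy_(self, values):
        # Mimics the dequant-and-fallback route: the copy lands in a
        # temporary dequantized list, so self.qdata is never touched.
        tmp = self.dequantize()
        for i, v in enumerate(values):
            tmp[i] = v             # update is lost along with tmp
        return self

class Module:
    def __init__(self, weight):
        self.weight = weight

m = Module(FakeQuantizedTensor([1, 2], scale=0.5))
backup = FakeQuantizedTensor([1, 2], scale=0.5)   # pristine backup

m.weight.qdata = [9, 9]            # "LoRA patch" mutates the weight

# Restore attempt 1: in-place copy (copy_to_param analog) silently fails.
m.weight.copy_(backup.dequantize())
assert m.weight.qdata == [9, 9]    # still patched

# Restore attempt 2: parameter replacement (set_attr_param analog) works.
m.weight = backup
assert m.weight.qdata == [1, 2]
```

This is why the patch forces the backup to record inplace_update=False whenever a set_func is involved.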
@@ -692,7 +692,15 @@ class ModelPatcher:
         inplace_update = self.weight_inplace_update or inplace_update

         if key not in self.backup and not return_weight:
-            self.backup[key] = collections.namedtuple('Dimension', ['weight', 'inplace_update'])(weight.to(device=self.offload_device, copy=inplace_update), inplace_update)
+            # When set_func is present (e.g. QuantizedTensor/fp8_scaled ops), it replaces
+            # the parameter object rather than modifying it in-place. The restore path
+            # must therefore also replace the parameter (set_attr_param) instead of doing
+            # an in-place copy (copy_to_param), because QuantizedTensor.__torch_dispatch__
+            # routes copy_() through dequant-and-fallback which silently fails to update
+            # the underlying quantized data. Force inplace_update=False in the backup
+            # for these keys so unpatch_model uses set_attr_param for restoration.
+            backup_inplace = inplace_update if set_func is None else False
+            self.backup[key] = collections.namedtuple('Dimension', ['weight', 'inplace_update'])(weight.to(device=self.offload_device, copy=inplace_update), backup_inplace)

         temp_dtype = comfy.model_management.lora_compute_dtype(device_to)
         if device_to is not None:
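The restore branch the patch targets can be sketched as follows. This is a simplified, hypothetical analog: Backup mirrors the namedtuple used in the diff, while restore(), PlainTensor, and Model are illustrative stand-ins for unpatch_model's choice between the copy_to_param and set_attr_param helpers, not the real ComfyUI implementations.

```python
import collections

# Same namedtuple shape as in the diff above.
Backup = collections.namedtuple('Dimension', ['weight', 'inplace_update'])

class PlainTensor:
    """Stand-in for a regular tensor, where in-place copy works fine."""
    def __init__(self, data):
        self.data = list(data)
    def copy_(self, other):
        self.data[:] = other.data

class Model:
    pass

def restore(model, key, entry):
    if entry.inplace_update:
        # copy_to_param analog: in-place copy into the existing parameter.
        # Unsafe for QuantizedTensor, whose copy_() dequantizes into a temp.
        getattr(model, key).copy_(entry.weight)
    else:
        # set_attr_param analog: replace the parameter object outright.
        # This is the path the fix forces when set_func is present.
        setattr(model, key, entry.weight)

m = Model()
m.weight = PlainTensor([9, 9])              # patched weight
entry = Backup(PlainTensor([1, 2]), False)  # set_func present -> False
restore(m, 'weight', entry)
assert m.weight.data == [1, 2]
```

For regular tensors both branches restore correctly; forcing the replacement branch only when set_func is present keeps the in-place path (and its memory behavior) for everything else.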