Fix training over existing LoRA

When training with LoRA and starting from existing LoRA weights, the current code produces a model with 0 trainable parameters, so training cannot work.
Adding the "is_trainable" param allows the loaded PEFT model to be trained and fixes the bug.
commit 193c73bce0 (parent 6abfd87d44)
Author: Angainor Development
Date: 2023-06-08 09:18:58 +02:00
Committed by: GitHub

@@ -402,6 +402,7 @@ def load_lora(model, cfg):
     model = PeftModel.from_pretrained(
         model,
         cfg.lora_model_dir,
+        is_trainable=True,
         device_map=cfg.device_map,
         # torch_dtype=torch.float16,
     )
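
For context, a minimal sketch of why the flag matters. This is not the repository's code; the base model name and adapter path are hypothetical placeholders:

    from peft import PeftModel
    from transformers import AutoModelForCausalLM

    # Hypothetical base model and adapter directory, for illustration only.
    base = AutoModelForCausalLM.from_pretrained("huggyllama/llama-7b")

    # Without is_trainable=True, PeftModel.from_pretrained loads the adapter
    # in inference mode and freezes its weights, leaving 0 trainable params.
    model = PeftModel.from_pretrained(base, "path/to/lora", is_trainable=True)

    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"trainable params: {trainable}")  # > 0 only with is_trainable=True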