Fix Lora config error for Llama3 (#1659)

The current YAML config throws an error: ValueError: Please set lora_modules_to_save to [`embed_tokens`, `lm_head`] when using an adapter and changing the special tokens.

I added the required lora_modules_to_save entries to resolve it.
Faria Huq
2024-05-28 11:25:08 -04:00
committed by GitHub
parent cc11c6bce2
commit 230e0ac363


@@ -24,6 +24,9 @@ lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true
lora_fan_in_fan_out:
+lora_modules_to_save:
+  - embed_tokens
+  - lm_head
wandb_project:
wandb_entity:
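The quoted error comes from a validation step: when training with an adapter while also changing the model's special tokens, the token embedding and LM head must be listed in lora_modules_to_save so their updated weights are saved alongside the adapter. A minimal sketch of that check (the function name, config keys as a plain dict, and the pad_token value are assumptions for illustration; axolotl's actual implementation differs):

```python
def check_lora_modules_to_save(cfg: dict) -> None:
    """Hypothetical re-implementation of the validation: raise if special
    tokens are changed under an adapter without saving the token embedding
    and LM head."""
    required = ["embed_tokens", "lm_head"]
    if cfg.get("adapter") and cfg.get("special_tokens"):
        saved = cfg.get("lora_modules_to_save") or []
        if not all(module in saved for module in required):
            raise ValueError(
                "Please set lora_modules_to_save to [`embed_tokens`, "
                "`lm_head`] when using an adapter and changing the "
                "special tokens."
            )

# With the patched config, the check passes silently:
check_lora_modules_to_save({
    "adapter": "lora",
    "special_tokens": {"pad_token": "<|end_of_text|>"},  # assumed example value
    "lora_modules_to_save": ["embed_tokens", "lm_head"],
})
```

Before the patch, the same config without the lora_modules_to_save list would trip the final branch and raise the ValueError shown in the commit message.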