Faria Huq
230e0ac363
Fix Lora config error for Llama3 ( #1659 )
The current yml config throws an error: `ValueError: Please set lora_modules_to_save to [embed_tokens, lm_head] when using an adapter and changing the special tokens.`
This commit adds the required changes to resolve it.
2024-05-28 11:25:08 -04:00
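The fix follows directly from the error message: when a LoRA adapter is used together with modified special tokens, the embedding and output layers must be listed in `lora_modules_to_save` so they are trained and saved alongside the adapter. A minimal sketch of the relevant config keys (surrounding keys and exact file layout are assumptions, not taken from the diff):

```yaml
# Hypothetical excerpt of the Llama 3 LoRA config; only the
# lora_modules_to_save addition is what the error message requires.
adapter: lora
lora_modules_to_save:
  - embed_tokens
  - lm_head
```

Without these entries, the resized/retokenized `embed_tokens` and `lm_head` weights would not be persisted with the adapter, which is why the config validation raises the `ValueError`.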