Commit Graph

92 Commits

Wing Lian
de2a7335e6 Merge pull request #55 from OpenAccess-AI-Collective/missing-validation-file
add missing file
2023-05-25 09:58:51 -04:00
Wing Lian
1d7da3b389 add missing file 2023-05-25 09:58:29 -04:00
Wing Lian
f523a0894c stray s 2023-05-25 09:23:56 -04:00
Wing Lian
676d7da661 cfg.cfg fix, also de-dupe lora module list 2023-05-25 09:18:57 -04:00
Wing Lian
a8771b0aad fix tuple add to list 2023-05-24 23:46:04 -04:00
Wing Lian
1cf21daf51 Update src/axolotl/utils/models.py
Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>
2023-05-24 23:31:12 -04:00
Wing Lian
ffd1043607 attempt to find linear modules for qlora 2023-05-24 23:18:08 -04:00
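
The entry above describes a common QLoRA setup step: discovering which linear layers to target with LoRA adapters. A hedged sketch of the idea, not the repository's actual code (the function name and Linear4bit check are assumptions); note the set, which de-dupes repeated layer names, matching the "de-dupe lora module list" fix later in this log:

```python
import bitsandbytes as bnb
import torch.nn as nn

def find_linear_module_names(model):
    """Collect leaf names of linear layers to use as LoRA target_modules."""
    names = set()  # a set de-dupes names repeated across transformer blocks
    for name, module in model.named_modules():
        # 4-bit quantized models replace nn.Linear with bnb.nn.Linear4bit
        if isinstance(module, (nn.Linear, bnb.nn.Linear4bit)):
            # "model.layers.0.self_attn.q_proj" -> "q_proj"
            names.add(name.split(".")[-1])
    names.discard("lm_head")  # the output head is usually not a LoRA target
    return sorted(names)
```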
Wing Lian
ce34d64e8a apply black formatting 2023-05-24 22:59:33 -04:00
Wing Lian
ce694e20a3 Merge branch 'main' of github.com:OpenAccess-AI-Collective/axolotl into dev 2023-05-24 22:59:09 -04:00
Wing Lian
1f5d83ea72 remove unneeded code, add validation 2023-05-24 22:47:43 -04:00
Valentin De Matos
88ad05df54 fix: handles AutoTokenizer from untrusted source
Set the trust_remote_code param depending on cfg.trust_remote_code when calling AutoTokenizer.from_pretrained
2023-05-24 20:57:10 +02:00
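
A minimal sketch of the change this message describes, assuming a cfg object with base_model and trust_remote_code fields (hypothetical names): the config flag is forwarded to AutoTokenizer.from_pretrained, so tokenizer code downloaded from the Hub only runs when the user explicitly opts in.

```python
from transformers import AutoTokenizer

def load_tokenizer(cfg):
    # Only execute remote tokenizer code when the config explicitly
    # allows it; anything falsy defaults to the safe behavior.
    return AutoTokenizer.from_pretrained(
        cfg.base_model,
        trust_remote_code=bool(cfg.trust_remote_code),
    )
```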
Wing Lian
e8aacfbd7c more qlora support 2023-05-24 14:33:18 -04:00
Wing Lian
b9d07aa95a prepare does all this already for qlora? 2023-05-24 14:32:39 -04:00
Wing Lian
3b4d055edd integrate qlora? maybe? 2023-05-24 14:32:39 -04:00
Wing Lian
2ae936fbc4 fix missing fp16 kwarg 2023-05-23 20:44:24 -04:00
Wing Lian
fb100a9ee1 fix enum pass as value 2023-05-23 11:34:03 -04:00
Wing Lian
3a503770e4 Add qa style data for alpaca instructions, fix one_cycle scheduler 2023-05-22 22:58:10 -04:00
Wing Lian
de6da13e19 don't need to set here 2023-05-22 12:12:01 -04:00
Wing Lian
9493b1b137 be able to use adam bnb 8bit and one cycle scheduler w fsdp 2023-05-22 09:00:49 -04:00
Wing Lian
1b3e401241 Update src/axolotl/utils/models.py for info msg
Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>
2023-05-21 23:01:35 -04:00
Wing Lian
98a6781f18 Update src/axolotl/utils/data.py for spelling
Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>
2023-05-21 23:00:13 -04:00
Wing Lian
607a4d33f2 make sure to use train split if loading from hf 2023-05-21 22:04:39 -04:00
Wing Lian
99383f14a3 make one cycle lr div factor configurable 2023-05-21 20:25:06 -04:00
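
Two entries above touch the optimizer/scheduler pairing (bitsandbytes 8-bit Adam, one-cycle LR). A hedged sketch of how they compose in plain PyTorch, with illustrative hyperparameters; div_factor is the knob this entry makes configurable, since OneCycleLR starts its warmup at max_lr / div_factor:

```python
import bitsandbytes as bnb
import torch
from torch.optim.lr_scheduler import OneCycleLR

model = torch.nn.Linear(16, 16).cuda()  # 8-bit optimizers need CUDA tensors
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=2e-4)
# One-cycle ramps from max_lr / div_factor up to max_lr, then anneals;
# exposing div_factor lets the user control how low the warmup starts.
scheduler = OneCycleLR(optimizer, max_lr=2e-4, total_steps=1_000, div_factor=25.0)
```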
Wing Lian
0f74464652 fix new dataset prompt tokenizers 2023-05-21 18:57:09 -04:00
Wing Lian
e0602a9e54 add missing __init__ 2023-05-21 16:36:41 -04:00
Wing Lian
2809f3f21b pygmalion dataset prompts format, cached tokenized datasets should be hashed on the tokenizer too 2023-05-21 16:16:09 -04:00
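
The caching point in the entry above generalizes: if tokenized datasets are cached on the dataset config alone, swapping tokenizers silently reuses stale caches. A hedged sketch of a cache key that also folds in the tokenizer (names are illustrative, not the repository's code):

```python
import hashlib

def tokenized_dataset_cache_key(dataset_config: str, tokenizer) -> str:
    # str(tokenizer) captures the tokenizer class, name/path, and special
    # tokens, so a different tokenizer yields a different cache key.
    payload = dataset_config + "|" + str(tokenizer)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```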
Wing Lian
4ea9a66dbd tokenization fixes 2023-05-21 08:33:06 -04:00
Wing Lian
1d5ab84486 optionally be able to specify alpaca or chat style prompts 2023-05-20 18:16:22 -04:00
NanoCode012
641f8012f9 Set half using cfg.fp16 for 4bit 2023-05-20 02:29:31 +09:00
Wing Lian
13650732f8 concise multiple choice and tldr summarize 2023-05-17 11:29:17 -04:00
Wing Lian
8c2f3cb0f8 support for replit lm 2023-05-17 08:49:03 -04:00
Wing Lian
b46bc02f0a add alpaca multiple choice instruct dataset support 2023-05-16 21:45:34 -04:00
NanoCode012
2c73c81348 Add lora_modules_to_save 2023-05-16 19:22:00 +09:00
Wing Lian
5e37144754 fix prompters, especially the sharegpt prompter 2023-05-15 22:15:36 -04:00
Wing Lian
bdbca8fa6c more fixes 2023-05-15 14:07:17 -04:00
Wing Lian
42410c783c more fixes 2023-05-14 09:16:41 -04:00
Wing Lian
aef00b6c13 fix torch_dtype for model load 2023-05-14 08:44:22 -04:00
Wing Lian
0d28df0fd2 move filter to before saving so it doesn't happen every time, update runpod manual script 2023-05-13 21:51:41 -04:00
Wing Lian
84c7bc4b68 whoops, gt vs lt 2023-05-12 14:03:25 -04:00
Wing Lian
aa3c3f97ae optimize dataloading to use cache, fix model token embedding sizes 2023-05-12 13:53:27 -04:00
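
For the embedding-size fix mentioned above, the standard recipe is to resize the model's embedding matrix whenever the tokenizer's vocabulary changes. A hedged sketch with a hypothetical model name:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "huggyllama/llama-7b"  # hypothetical; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Adding tokens (e.g. a PAD token) grows the vocabulary, so the
# embedding matrix must be resized to match len(tokenizer).
tokenizer.add_special_tokens({"pad_token": "[PAD]"})
model.resize_token_embeddings(len(tokenizer))
```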
NanoCode012
89b7f26b9d Merge branch 'main' into patch-2 2023-05-11 21:18:38 +09:00
Wing Lian
2bc1a5bde1 black formatting 2023-05-10 16:01:08 -04:00
Wing Lian
7a490a4646 various fixes 2023-05-10 16:00:09 -04:00
NanoCode012
813aab378f Fix Trainer() got multiple values for keyword argument 'callbacks' 2023-05-10 18:28:28 +09:00
Wing Lian
e2e68c3965 testing mpt triton 2023-05-09 20:57:40 -04:00
Wing Lian
a27d594788 fix conditional so alpaca doesn't choke 2023-05-09 20:57:07 -04:00
NanoCode012
174b74ddc9 Rename variable to use same convention 2023-05-09 02:49:44 +09:00
NanoCode012
cf681537ec Add CompletionPrompt type 2023-05-09 02:49:44 +09:00
Wing Lian
bd3c5a5cb3 Merge pull request #21 from NanoCode012/patch-1
Fix: Scheduler and optimizer condition
2023-05-08 13:34:44 -04:00
Wing Lian
bcbc99e655 Merge pull request #19 from NanoCode012/feat/callback-save-lora
Feat: Add callback save peft_model on_save
2023-05-08 13:34:07 -04:00