author | commit | message | date
Wing Lian | 1f5d83ea72 | remove un-needed code, add validation | 2023-05-24 22:47:43 -04:00
Wing Lian | 7e81ca720b | Update requirements.txt (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-24 15:44:48 -04:00
Wing Lian | e8aacfbd7c | more qlora support | 2023-05-24 14:33:18 -04:00
Wing Lian | b9d07aa95a | prepare does all this already for qlora? | 2023-05-24 14:32:39 -04:00
Wing Lian | 3b4d055edd | integrate qlora? maybe? | 2023-05-24 14:32:39 -04:00
Wing Lian | 2ae936fbc4 | fix missing fp16 kwarg | 2023-05-23 20:44:24 -04:00
Wing Lian | fb100a9ee1 | fix enum pass as value | 2023-05-23 11:34:03 -04:00
Wing Lian | 3a503770e4 | Add qa style data for alpaca instructions, fix one_cycle scheduler | 2023-05-22 22:58:10 -04:00
Wing Lian | b029a11e65 | Merge pull request #34 from OpenAccess-AI-Collective/dev-unstable (lots of various improvements) | 2023-05-22 12:14:56 -04:00
Wing Lian | e3df3a9f5d | cuda/pytorch matrix builds | 2023-05-22 12:14:21 -04:00
Wing Lian | f950a881e1 | cuda, pytorch matrix for base builds | 2023-05-22 12:12:08 -04:00
Wing Lian | de6da13e19 | don't need to set here | 2023-05-22 12:12:01 -04:00
Wing Lian | 9493b1b137 | be able to use adam bnb 8bit and one cycle scheduler w fsdp | 2023-05-22 09:00:49 -04:00
Wing Lian | 1b3e401241 | Update src/axolotl/utils/models.py for info msg (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-21 23:01:35 -04:00
Wing Lian | 3457810988 | Update scripts/finetune.py (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-21 23:00:28 -04:00
Wing Lian | ae1719d30c | Update scripts/finetune.py for logging (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-21 23:00:23 -04:00
Wing Lian | 98a6781f18 | Update src/axolotl/utils/data.py for spelling (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-21 23:00:13 -04:00
Wing Lian | 607a4d33f2 | make sure to use train split if loading from hf | 2023-05-21 22:04:39 -04:00
Wing Lian | 99383f14a3 | make one cycle lr div factor configurable | 2023-05-21 20:25:06 -04:00
Wing Lian | 0f74464652 | fix new dataset prompt tokenizers | 2023-05-21 18:57:09 -04:00
Wing Lian | e0602a9e54 | add missing __init__ | 2023-05-21 16:36:41 -04:00
Wing Lian | 2809f3f21b | pygmalion dataset prompts format, cached tokenized datasets should be hashed on the tokenizer too | 2023-05-21 16:16:09 -04:00
Wing Lian | 4ea9a66dbd | tokenization fixes | 2023-05-21 08:33:06 -04:00
Wing Lian | 1d5ab84486 | optionally be able to specify alpaca or chat style prompts | 2023-05-20 18:16:22 -04:00
Wing Lian | fa8bd14be4 | update entrypoint and force min accelerate | 2023-05-18 06:25:34 -04:00
Wing Lian | 13650732f8 | concise multiple choice and tldr summarize | 2023-05-17 11:29:17 -04:00
Wing Lian | 8c2f3cb0f8 | support for replit lm | 2023-05-17 08:49:03 -04:00
Wing Lian | b46bc02f0a | add alpaca multiple choice instruct dataset support | 2023-05-16 21:45:34 -04:00
Wing Lian | f98e173b59 | reorder options so debug can happen in the same prepare step | 2023-05-15 22:26:30 -04:00
Wing Lian | 5e37144754 | fix prompters, especially the sharegpt prompter | 2023-05-15 22:15:36 -04:00
Wing Lian | bdbca8fa6c | more fixes | 2023-05-15 14:07:17 -04:00
Wing Lian | 42410c783c | more fixes | 2023-05-14 09:16:41 -04:00
Wing Lian | aef00b6c13 | fix torch_dtype for model load | 2023-05-14 08:44:22 -04:00
Wing Lian | 0d28df0fd2 | move filter to before saving so it doesn't happen everytime, update runpod manual script | 2023-05-13 21:51:41 -04:00
Wing Lian | 84c7bc4b68 | whoops, gt vs lt | 2023-05-12 14:03:25 -04:00
Wing Lian | aa3c3f97ae | optimize dataloading to use cache, fix model token embedding sizes | 2023-05-12 13:53:27 -04:00
Wing Lian | f6d1fa4a85 | Merge pull request #25 from NanoCode012/patch-2 (Fix Trainer() got multiple values for keyword argument 'callbacks') | 2023-05-11 09:20:15 -04:00
NanoCode012 | 89b7f26b9d | Merge branch 'main' into patch-2 | 2023-05-11 21:18:38 +09:00
Wing Lian | 165da584b3 | fix config for parity with previous change (5159d00a86#diff-65b4693504c4e8ffac76c7f2c90913faee381f802cf64e7f49c995a2134ed3b3R164) | 2023-05-11 08:13:09 -04:00
Wing Lian | 4cc7ed8898 | Merge pull request #27 from NanoCode012/patch-1 (Fix save typo) | 2023-05-11 07:27:31 -04:00
NanoCode012 | 52aada7174 | Fix typo | 2023-05-11 20:22:30 +09:00
Wing Lian | 688c73a81e | Merge pull request #26 from OpenAccess-AI-Collective/mpt-triton (Mpt triton) | 2023-05-10 16:02:05 -04:00
Wing Lian | 2bc1a5bde1 | black formatting | 2023-05-10 16:01:08 -04:00
Wing Lian | 7a490a4646 | various fixes | 2023-05-10 16:00:09 -04:00
NanoCode012 | 813aab378f | Fix Trainer() got multiple values for keyword argument 'callbacks' | 2023-05-10 18:28:28 +09:00
Wing Lian | e2e68c3965 | testing mpt triton | 2023-05-09 20:57:40 -04:00
Wing Lian | a27d594788 | fix conditional so alpaca doesn't choke | 2023-05-09 20:57:07 -04:00
Wing Lian | 1fb0376150 | Merge pull request #23 from NanoCode012/patch-1 (Fix: Save adapter for lora) | 2023-05-09 15:05:58 -04:00
Wing Lian | 915c56cd97 | Update finetune.py | 2023-05-09 15:05:39 -04:00
Wing Lian | df9c5085b5 | not everyone has bf16 available | 2023-05-09 14:47:48 -04:00