47 Commits

Author | SHA1 | Message | Date
Steffen Röcker | 945c4191a3 | Use AutoTokenizer for redpajama example | 2023-06-14 20:09:26 +02:00
Wing Lian | 16bb6276a5 | Merge pull request #92 from OpenAccess-AI-Collective/flash-optimum: add support for opimum bettertransformers | 2023-06-14 07:50:15 -04:00
Wing Lian | fd2c9814c9 | Merge branch 'main' into flash-optimum | 2023-06-12 13:12:15 -04:00
Wing Lian | 2ba4ae8f46 | tweak config to work | 2023-06-12 10:07:18 -04:00
Wing Lian | 94f310c7a6 | Merge pull request #193 from OpenAccess-AI-Collective/config-fixes-20230612: config fixes | 2023-06-12 08:24:52 -04:00
NanoCode012 | 52cde69288 | Fix config path after config moved | 2023-06-12 17:06:15 +09:00
Wing Lian | 9a58e99e81 | config fixes | 2023-06-12 01:52:58 -04:00
Wing Lian | 6b3f509d9e | forgot to add this file | 2023-06-11 11:50:12 -04:00
Wing Lian | d0d7eaa4f3 | update openllama and clean up paths | 2023-06-11 11:03:31 -04:00
Wing Lian | effbbf6dd1 | more pruning | 2023-06-11 10:38:24 -04:00
Wing Lian | c530e4b9c8 | more config pruning and migrating | 2023-06-11 10:09:05 -04:00
Wing Lian | 77762a5d6b | get rid of some configs, formalize pythioa lora config | 2023-06-11 09:41:41 -04:00
Wing Lian | 0c6f928601 | address PR feedback | 2023-06-10 14:23:56 -04:00
Wing Lian | 1db46a9c72 | linting fix | 2023-06-10 14:23:56 -04:00
Wing Lian | 39619028a3 | use pythia-12b, neox-20b is flaky | 2023-06-10 14:22:30 -04:00
NanoCode012 | c8242de725 | Merge pull request #132 from utensil/falcon-7b-qlora: Axolotl supports falcon + qlora | 2023-06-09 01:14:03 +09:00
Utensil | 79a8f52181 | Trim trailing whitespace | 2023-06-08 23:48:57 +08:00
Utensil | a52f4816b0 | Default wandb_project to empty as suggested (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-06-08 23:04:19 +08:00
Utensil | c9c050316f | Default micro_batch_size to 1 for a safer start | 2023-06-03 17:26:33 +08:00
Utensil | ca11ae9689 | Add comments/alternatives for falcon-qlora configs | 2023-06-03 15:04:02 +08:00
Utensil | fb3d40f197 | falcon + qlora + xformer mbs 40 gas 2 on A6000 | 2023-06-01 18:29:20 +08:00
Utensil | 72bf8aafb6 | Create config-7b-qlora.yml | 2023-06-01 00:00:37 +08:00
Wing Lian | c2a0792680 | swap batch size for gradient accumulation steps to decouple from num gpu | 2023-05-31 09:38:12 -04:00
Wing Lian | 4df9da74e3 | Merge pull request #105 from viktoriussuwandi/viktoriussuwandi-patch: Viktoriussuwandi patch | 2023-05-30 15:05:23 -04:00
Wing Lian | 2531ea24c1 | Merge pull request #106 from fearnworks/qlora-openllama-3b-example: Qlora openllama 3b example | 2023-05-30 15:05:05 -04:00
NanoCode012 | 392dfd9b07 | Lint and format | 2023-05-31 02:53:22 +09:00
jphillips | 6cee881d64 | Update examples/qlora-openllama-3b/README.md (Co-authored-by: Wing Lian <wing.lian@gmail.com>) | 2023-05-30 09:33:33 -05:00
jphillips | ac85c0ed36 | Add Readme, Clean up comments | 2023-05-29 14:35:58 -05:00
jphillips | 370d057096 | Add qlora-openllama-3b example | 2023-05-29 09:07:46 -05:00
Viktorius Suwandi | 15e57ba6ee | Update wandb_log_model on config.yml | 2023-05-29 16:33:20 +07:00
Viktorius Suwandi | 4eb68ac3f7 | Update wandb_log_model on config-3b.yml | 2023-05-29 16:32:49 +07:00
Viktorius Suwandi | fad06befee | Update wandb_log_model on config.yml | 2023-05-29 15:42:38 +07:00
Wing Lian | 147241ca66 | Merge branch 'main' into refactor/rename-4b-to-gptq | 2023-05-27 09:37:52 -04:00
Wing Lian | 157420df13 | sane starter defaults and add lora | 2023-05-27 09:33:14 -04:00
Wing Lian | d5f944ce2a | add example for falcoln support | 2023-05-27 09:16:43 -04:00
Thytu | 7cf07fc8b3 | refactor(example): rename 4bit-lora-7b by gptq-lora-7b (Signed-off-by: Thytu <vdmatos@gladia.io>) | 2023-05-27 12:37:53 +00:00
Thytu | dd0065773a | refactor(param): rename load_4bit config param by gptq (Signed-off-by: Thytu <vdmatos@gladia.io>) | 2023-05-27 12:36:03 +00:00
NanoCode012 | 79560934f9 | Disable formers for QuickStart config | 2023-05-26 22:23:38 +09:00
Wing Lian | 98b1bce57e | pr comments addressed | 2023-05-25 12:25:07 -04:00
Wing Lian | e396654319 | fix tokenizer loading, got openllama 3b working | 2023-05-25 12:15:12 -04:00
Wing Lian | a5d739b66b | fixes w/ example for super basic lora starter | 2023-05-25 11:59:08 -04:00
Wing Lian | 8c2f3cb0f8 | support for replit lm | 2023-05-17 08:49:03 -04:00
Wing Lian | 165da584b3 | fix config for parity with previous change (5159d00a86#diff-65b4693504c4e8ffac76c7f2c90913faee381f802cf64e7f49c995a2134ed3b3R164) | 2023-05-11 08:13:09 -04:00
Wing Lian | df9c5085b5 | not everyone has bf16 available | 2023-05-09 14:47:48 -04:00
Wing Lian | 7967cd1039 | add 4bit lora 7b | 2023-05-09 14:38:32 -04:00
Wing Lian | 02c59832a3 | push up redpajama 3b example | 2023-05-08 19:19:18 -04:00
Wing Lian | a125693122 | add support for trust_remote_code for mpt models | 2023-05-08 12:07:27 -04:00