c2a0792680 swap batch size for gradient accumulation steps to decouple from num gpu (Wing Lian, 2023-05-31 09:38:12 -04:00)
e0ccaccce2 Update wandb_log_model on vicuna_13B_4bit_reflect.yml (Viktorius Suwandi, 2023-05-29 16:34:13 +07:00)
b6a539b53c Update wandb_log_model on cerebras_1_3B_alpaca.yml (Viktorius Suwandi, 2023-05-29 16:32:20 +07:00)
abddcf4dfe Update wandb_log_model on pythia_1_2B_alpaca.yml (Viktorius Suwandi, 2023-05-29 16:31:53 +07:00)
15aabd2903 Update wandb_log_model on llama_7B_jeopardy.yml (Viktorius Suwandi, 2023-05-29 15:44:01 +07:00)
232b931081 Update wandb_log_model on llama_65B_alpaca.yml (Viktorius Suwandi, 2023-05-29 15:43:43 +07:00)
0736f4f9c1 Update wandb_log_model on llama_13B_alpaca.yml (Viktorius Suwandi, 2023-05-29 15:43:20 +07:00)
d77d736631 Update wandb_log_model on llama_7B_alpaca.yml (Viktorius Suwandi, 2023-05-29 15:43:01 +07:00)
2aacf75ee1 Update wandb_log_model on galactica_1_3B.yml (Viktorius Suwandi, 2023-05-29 15:42:19 +07:00)
71871345a6 Update wandb_log_model on llama_7B_4bit.yml (Viktorius Suwandi, 2023-05-29 15:41:59 +07:00)
0d14e951a8 Update wandb_log_model on stability_3b.yml (Viktorius Suwandi, 2023-05-29 15:41:42 +07:00)
84fc217f79 Update wandb_log_model on gpt_neox_20b.yml (Viktorius Suwandi, 2023-05-29 15:41:24 +07:00)
f317296259 Update wandb_log_model on quickstart.yml (Viktorius Suwandi, 2023-05-29 15:40:58 +07:00)
42a971df32 Update wandb_log_model on sample.yml (Viktorius Suwandi, 2023-05-29 15:39:42 +07:00)
dd0065773a refactor(param): rename load_4bit config param to gptq (Thytu, 2023-05-27 12:36:03 +00:00)
    Signed-off-by: Thytu <vdmatos@gladia.io>
165da584b3 fix config for parity with previous change, see 5159d00a86#diff-65b4693504c4e8ffac76c7f2c90913faee381f802cf64e7f49c995a2134ed3b3R164 (Wing Lian, 2023-05-11 08:13:09 -04:00)
a12fb0a8da Jeopardy bot! (#17) (Wing Lian, 2023-05-08 03:21:40 -04:00)
    * support for jeopardy dataset
    * commit the final config for jeopardy bot
|
Wing Lian
|
4818380fa6
|
update stablelm config
|
2023-05-07 01:58:23 -04:00 |
|
Wing Lian
|
4a17a4c9a1
|
fix dataset handling, support galactica
|
2023-04-24 10:54:45 -04:00 |
|
Wing Lian
|
097d367af6
|
tweaks to data loading, 8 bit adam, accelerate and deepspeed
|
2023-04-24 09:41:35 -04:00 |
|
Wing Lian
|
8d437853c8
|
fix sharegpt handling from hf, don't worry about loading llama if using earlier transformers release
|
2023-04-24 09:41:35 -04:00 |
|
Wing Lian
|
94f5e415a3
|
various bugfixes
|
2023-04-24 09:41:34 -04:00 |
|
Wing Lian
|
0a472e1e08
|
quickstart instructions for starting from runpod (#5)
|
2023-04-18 19:22:25 -04:00 |
|
Wing Lian
|
6045345d6b
|
WIP large refactor to make finetune script a little more manageable (#3)
|
2023-04-18 14:01:38 -04:00 |
|
Wing Lian
|
81de0efc18
|
add support for alpaca reflect training (#2)
|
2023-04-18 08:34:05 -04:00 |
|
Wing Lian
|
87e073d0de
|
fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets
|
2023-04-17 18:01:12 -04:00 |
|
Wing Lian
|
77fca25f1b
|
4bit quantized support (wip)
|
2023-04-17 11:37:39 -04:00 |
|
Wing Lian
|
d1aed4c8e5
|
deepspeed doesn't work with flash-attn, and the gpu savings w flash attn are better than the deepspeed headaches
|
2023-04-16 06:59:47 -04:00 |
|
d060c803ce add llama 7b config and fix lora_fan_in_fan_out for llama (copy pasta bug) (Wing Lian, 2023-04-15 14:26:52 -04:00)
05fffb53b4 more logging, wandb fixes (Wing Lian, 2023-04-15 13:37:17 -04:00)
b164725417 improve prepared dataset loading, fix inference (Wing Lian, 2023-04-15 12:14:52 -04:00)
937f44f021 helpful info output (Wing Lian, 2023-04-15 00:03:43 -04:00)
80b2ed29d8 various bugfixes (Wing Lian, 2023-04-14 21:37:07 -04:00)
949a27be21 more fixes and prep for llama training (Wing Lian, 2023-04-14 18:30:09 -04:00)
f2a2029d0d config chooser, update readme instructions, device config, llama flash attention, debug out the labels, fix config key checks, other bugfixes (Wing Lian, 2023-04-14 12:18:56 -04:00)
8d959a7e26 make it work with pythia in the cloud (Wing Lian, 2023-04-14 07:24:55 -04:00)
ce24f5e246 WIP for axolotl trainer (Wing Lian, 2023-04-14 00:20:05 -04:00)