Commit Graph

23 Commits

SHA1        Author     Date

dd0065773a  Thytu      2023-05-27 12:36:03 +00:00
    refactor(param): rename load_4bit config param by gptq
    Signed-off-by: Thytu <vdmatos@gladia.io>

165da584b3  Wing Lian  2023-05-11 08:13:09 -04:00
    fix config for parity with previous change
    5159d00a86#diff-65b4693504c4e8ffac76c7f2c90913faee381f802cf64e7f49c995a2134ed3b3R164

a12fb0a8da  Wing Lian  2023-05-08 03:21:40 -04:00
    Jeopardy bot! (#17)
    * support for jeopardy dataset
    * commit the final config for jeopardy bot

4818380fa6  Wing Lian  2023-05-07 01:58:23 -04:00
    update stablelm config

4a17a4c9a1  Wing Lian  2023-04-24 10:54:45 -04:00
    fix dataset handling, support galactica

097d367af6  Wing Lian  2023-04-24 09:41:35 -04:00
    tweaks to data loading, 8 bit adam, accelerate and deepspeed

8d437853c8  Wing Lian  2023-04-24 09:41:35 -04:00
    fix sharegpt handling from hf, don't worry about loading llama if using earlier transformers release

94f5e415a3  Wing Lian  2023-04-24 09:41:34 -04:00
    various bugfixes

0a472e1e08  Wing Lian  2023-04-18 19:22:25 -04:00
    quickstart instructions for starting from runpod (#5)

6045345d6b  Wing Lian  2023-04-18 14:01:38 -04:00
    WIP large refactor to make finetune script a little more manageable (#3)

81de0efc18  Wing Lian  2023-04-18 08:34:05 -04:00
    add support for alpaca reflect training (#2)

87e073d0de  Wing Lian  2023-04-17 18:01:12 -04:00
    fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets

77fca25f1b  Wing Lian  2023-04-17 11:37:39 -04:00
    4bit quantized support (wip)

d1aed4c8e5  Wing Lian  2023-04-16 06:59:47 -04:00
    deepspeed doesn't work with flash-attn, and the gpu savings w flash attn are better than the deepspeed headaches

d060c803ce  Wing Lian  2023-04-15 14:26:52 -04:00
    add llama 7b config and fiz lora_fan_in_fan_out for llama (copy pasta bug)

05fffb53b4  Wing Lian  2023-04-15 13:37:17 -04:00
    more logging, wandb fixes

b164725417  Wing Lian  2023-04-15 12:14:52 -04:00
    improve prepared dataset loading, fix inference

937f44f021  Wing Lian  2023-04-15 00:03:43 -04:00
    helpful info output

80b2ed29d8  Wing Lian  2023-04-14 21:37:07 -04:00
    various bugfixes

949a27be21  Wing Lian  2023-04-14 18:30:09 -04:00
    more fixes and prep for llama training

f2a2029d0d  Wing Lian  2023-04-14 12:18:56 -04:00
    config chooser, update readme instructions, device config, llama flash attention, debug out the labels, fix config key checks, other bugfixes

8d959a7e26  Wing Lian  2023-04-14 07:24:55 -04:00
    make it work with pythia in the cloud

ce24f5e246  Wing Lian  2023-04-14 00:20:05 -04:00
    WIP for axolotl trainer