Wing Lian | 147241ca66 | Merge branch 'main' into refactor/rename-4b-to-gptq | 2023-05-27 09:37:52 -04:00
Wing Lian | 4c906339f7 | fix auto linear modules for lora w/o any set already | 2023-05-27 08:49:43 -04:00
Wing Lian | 4c500f5830 | checking for False is not sufficient for NoneType/unset configs | 2023-05-27 08:43:48 -04:00
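The "checking for False is not sufficient for NoneType/unset configs" fix points at a classic Python pitfall: an option that was never set comes back as None, which an `is False` guard silently misses. A minimal illustration (the `gptq` key is used only as an example name):

```python
cfg = {}                     # the option was never set in the config
value = cfg.get("gptq")      # -> None, not False

# Buggy guard: `is False` never fires for an unset (None) option.
if value is False:
    disabled_by_false_check = True
else:
    disabled_by_false_check = False

# Safer guard: treat anything that is not an explicit True as disabled.
disabled = value is not True
```

With `value = None`, the first guard concludes the option is *not* disabled, while the second correctly treats unset the same as False.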
Thytu | dd0065773a | refactor(param): rename load_4bit config param to gptq (Signed-off-by: Thytu <vdmatos@gladia.io>) | 2023-05-27 12:36:03 +00:00
NanoCode012 | ec3c0314bf | Merge pull request #65 from NanoCode012/feat/target-linear (Feat: Add `cfg.lora_target_linear`) | 2023-05-26 22:39:38 +09:00
NanoCode012 | fe0e69f4f9 | Fix recommendation condition | 2023-05-26 22:19:50 +09:00
NanoCode012 | 919623793a | Add cfg.lora_target_linear | 2023-05-26 14:32:30 +09:00
Wing Lian | a5bf838685 | add logging and make sure model unloads to float16 | 2023-05-26 00:09:55 -04:00
Wing Lian | a4f12415a0 | update readme and add typehints | 2023-05-25 23:10:11 -04:00
Wing Lian | 48f4c0571e | fix validation for qlora merge | 2023-05-25 23:02:03 -04:00
Wing Lian | 1987e5cf56 | qlora and 4bit check so we are able to merge and unload | 2023-05-25 22:55:13 -04:00
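The "qlora and 4bit check so we are able to merge and unload" commit suggests a fail-fast validation: merging a LoRA adapter back into its base weights only makes sense when the base is not quantized. A hedged sketch of that idea (function and config key names are assumptions, not the repo's actual code):

```python
def validate_merge(cfg: dict) -> None:
    """Reject configs that ask to merge a LoRA into a quantized base model."""
    wants_merge = cfg.get("merge_lora") is True
    quantized_base = cfg.get("gptq") is True or cfg.get("load_in_4bit") is True
    if wants_merge and quantized_base:
        raise ValueError(
            "cannot merge a LoRA adapter into a quantized (gptq/4bit) base model"
        )
```

Raising early in config validation is cheaper than discovering the problem after the model has loaded.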
Wing Lian | 7b5e762be2 | fix merge conflict failure, black format | 2023-05-25 22:40:27 -04:00
Wing Lian | 34c99f9812 | fixes to make qlora actually work | 2023-05-25 22:37:23 -04:00
Wing Lian | 2e56203b50 | another fix for shard and train split | 2023-05-25 17:23:57 -04:00
Wing Lian | ac79360161 | shard fix | 2023-05-25 16:31:59 -04:00
Wing Lian | 943961fd10 | missed ... | 2023-05-25 12:42:56 -04:00
Wing Lian | d2a6f79fd1 | change auth token setting back | 2023-05-25 12:41:17 -04:00
Wing Lian | 004820209d | Update src/axolotl/prompters.py (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-25 12:21:02 -04:00
Wing Lian | e396654319 | fix tokenizer loading, got openllama 3b working | 2023-05-25 12:15:12 -04:00
Wing Lian | a5d739b66b | fixes w/ example for super basic lora starter | 2023-05-25 11:59:08 -04:00
Wing Lian | de2a7335e6 | Merge pull request #55 from OpenAccess-AI-Collective/missing-validation-file (add missing file) | 2023-05-25 09:58:51 -04:00
Wing Lian | 1d7da3b389 | add missing file | 2023-05-25 09:58:29 -04:00
Wing Lian | f523a0894c | stray s | 2023-05-25 09:23:56 -04:00
Wing Lian | 676d7da661 | cfg.cfg fix, also de-dupe lora module list | 2023-05-25 09:18:57 -04:00
Wing Lian | a8771b0aad | fix tuple add to list | 2023-05-24 23:46:04 -04:00
Wing Lian | 1cf21daf51 | Update src/axolotl/utils/models.py (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-24 23:31:12 -04:00
Wing Lian | ffd1043607 | attempt to find linear modules for qlora | 2023-05-24 23:18:08 -04:00
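The idea behind "attempt to find linear modules for qlora" (and the later "de-dupe lora module list" fix) can be sketched as: walk the module tree, collect the leaf names of every Linear-like layer, and de-duplicate them so they can serve as LoRA target modules. This is a hypothetical sketch, not the repo's actual implementation; in practice `linear_classes` would be `(torch.nn.Linear,)` or bitsandbytes' 4-bit linear class:

```python
def find_linear_module_names(model, linear_classes) -> list:
    """Return the sorted, de-duplicated leaf names of Linear-like submodules."""
    names = set()
    for full_name, module in model.named_modules():
        if isinstance(module, tuple(linear_classes)):
            names.add(full_name.split(".")[-1])  # keep only the leaf name
    names.discard("lm_head")  # the output head is usually excluded from LoRA targets
    return sorted(names)
```

Using a set both de-duplicates names repeated across layers (`q_proj` in every block) and avoids the tuple-vs-list append bug mentioned in "fix tuple add to list".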
Wing Lian | ce34d64e8a | apply black formatting | 2023-05-24 22:59:33 -04:00
Wing Lian | ce694e20a3 | Merge branch 'main' of github.com:OpenAccess-AI-Collective/axolotl into dev | 2023-05-24 22:59:09 -04:00
Wing Lian | 1f5d83ea72 | remove un-needed code, add validation | 2023-05-24 22:47:43 -04:00
Valentin De Matos | 88ad05df54 | fix: handle AutoTokenizer from untrusted source (set trust_remote_code depending on cfg.trust_remote_code when calling AutoTokenizer.from_pretrained) | 2023-05-24 20:57:10 +02:00
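The trust_remote_code commit describes plumbing a config flag through to the tokenizer loader. A minimal sketch of that plumbing, assuming a dict-like cfg (the helper name is hypothetical; the real call would be transformers' `AutoTokenizer.from_pretrained`): executing remote code must be opt-in, so an unset flag defaults to False.

```python
def tokenizer_load_kwargs(cfg: dict) -> dict:
    """Build kwargs for the tokenizer loader; remote code is opt-in only."""
    return {"trust_remote_code": cfg.get("trust_remote_code") is True}
```

Usage would look like `AutoTokenizer.from_pretrained(base_model, **tokenizer_load_kwargs(cfg))`, so a config that never mentions the flag stays on the safe default.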
Wing Lian | e8aacfbd7c | more qlora support | 2023-05-24 14:33:18 -04:00
Wing Lian | b9d07aa95a | prepare does all this already for qlora? | 2023-05-24 14:32:39 -04:00
Wing Lian | 3b4d055edd | integrate qlora? maybe? | 2023-05-24 14:32:39 -04:00
Wing Lian | 2ae936fbc4 | fix missing fp16 kwarg | 2023-05-23 20:44:24 -04:00
Wing Lian | fb100a9ee1 | fix enum pass as value | 2023-05-23 11:34:03 -04:00
Wing Lian | 3a503770e4 | Add qa style data for alpaca instructions, fix one_cycle scheduler | 2023-05-22 22:58:10 -04:00
Wing Lian | de6da13e19 | don't need to set here | 2023-05-22 12:12:01 -04:00
Wing Lian | 9493b1b137 | be able to use adam bnb 8bit and one cycle scheduler w fsdp | 2023-05-22 09:00:49 -04:00
Wing Lian | 1b3e401241 | Update src/axolotl/utils/models.py for info msg (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-21 23:01:35 -04:00
Wing Lian | 98a6781f18 | Update src/axolotl/utils/data.py for spelling (Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>) | 2023-05-21 23:00:13 -04:00
Wing Lian | 607a4d33f2 | make sure to use train split if loading from hf | 2023-05-21 22:04:39 -04:00
Wing Lian | 99383f14a3 | make one cycle lr div factor configurable | 2023-05-21 20:25:06 -04:00
Wing Lian | 0f74464652 | fix new dataset prompt tokenizers | 2023-05-21 18:57:09 -04:00
Wing Lian | e0602a9e54 | add missing __init__ | 2023-05-21 16:36:41 -04:00
Wing Lian | 2809f3f21b | pygmalion dataset prompts format, cached tokenized datasets should be hashed on the tokenizer too | 2023-05-21 16:16:09 -04:00
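"cached tokenized datasets should be hashed on the tokenizer too" names a real cache-invalidation hazard: the same raw dataset tokenized by two different tokenizers must not share a cache entry. A sketch of a tokenizer-aware cache key (helper name and fingerprint fields are assumptions; a real fingerprint might also include special tokens and the prompt format):

```python
import hashlib

def tokenized_cache_key(dataset_id: str, tokenizer_name: str, vocab_size: int) -> str:
    """Derive a cache key that changes whenever the dataset OR tokenizer changes."""
    payload = f"{dataset_id}|{tokenizer_name}|{vocab_size}".encode("utf-8")
    return hashlib.sha256(payload).hexdigest()
```

Keying only on the dataset would happily serve llama-tokenized ids to a model with a different vocabulary; folding the tokenizer into the hash makes the stale entry unreachable instead.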
Wing Lian | 4ea9a66dbd | tokenization fixes | 2023-05-21 08:33:06 -04:00
Wing Lian | 1d5ab84486 | optionally be able to specify alpaca or chat style prompts | 2023-05-20 18:16:22 -04:00
NanoCode012 | 641f8012f9 | Set half using cfg.fp16 for 4bit | 2023-05-20 02:29:31 +09:00
Wing Lian | 13650732f8 | concise multiple choice and tldr summarize | 2023-05-17 11:29:17 -04:00