Commit Graph

285 Commits

Author SHA1 Message Date
NanoCode012
ed2dd77e35 Merge pull request #89 from OpenAccess-AI-Collective/NanoCode012-update-action-version ("Feat: Update actions version") 2023-05-28 02:12:26 +09:00
NanoCode012
2b8c28bab8 Update actions version 2023-05-28 01:51:10 +09:00
NanoCode012
782996d94a Merge pull request #86 from OpenAccess-AI-Collective/NanoCode012-warning-remote-code ("Feat: Add warning for `trust_remote_code`") 2023-05-28 01:29:35 +09:00
NanoCode012
b50d7d311c Merge pull request #88 from OpenAccess-AI-Collective/NanoCode012-completion-prompter-no-inherit ("Fix: Remove base class inherit for CompletionPrompter") 2023-05-28 01:29:03 +09:00
Wing Lian
35af017001 Merge pull request #87 from OpenAccess-AI-Collective/add_prompter_tests ("automated testing in github actions") 2023-05-27 12:21:23 -04:00
Wing Lian
a653392287 use requirements file for tests 2023-05-27 12:17:46 -04:00
Wing Lian
72b6ca0d9f cache pip 2023-05-27 12:16:54 -04:00
Wing Lian
7f53fd2ab6 alright, just local install it 2023-05-27 12:16:06 -04:00
Wing Lian
c29d33352c move python path to same step as tests 2023-05-27 12:06:23 -04:00
Wing Lian
403af0b1d7 fix path and streamline pip installs 2023-05-27 11:58:37 -04:00
NanoCode012
9ac1884323 Fix: Remove base class inherit for CompletionPrompter 2023-05-28 00:51:35 +09:00
Wing Lian
d199d6c261 automated testing in github actions 2023-05-27 11:51:01 -04:00
NanoCode012
2824423d10 Add warning for trust_remote_code 2023-05-28 00:46:56 +09:00
NanoCode012
cb18856fc2 Merge pull request #85 from NanoCode012/fix/add-dataset-shard-readme ("Feat: Add `dataset_shard_num` and `dataset_shard_idx` to Readme") 2023-05-27 23:52:50 +09:00
NanoCode012
8626b54aab Add dataset_shard_num and dataset_shard_idx 2023-05-27 23:51:17 +09:00
Wing Lian
87dffbc451 Merge pull request #75 from Thytu/refactor/rename-4b-to-gptq ("refactor: change 4bit nomenclature to gptq") 2023-05-27 09:37:57 -04:00
Wing Lian
147241ca66 Merge branch 'main' into refactor/rename-4b-to-gptq 2023-05-27 09:37:52 -04:00
Wing Lian
7e974decb7 Merge pull request #76 from OpenAccess-AI-Collective/truthy-validation ("Truthy validation") 2023-05-27 09:36:10 -04:00
Wing Lian
11fd39b1f5 Merge pull request #78 from OpenAccess-AI-Collective/falcoln-support ("falcon: sane starter defaults and add lora support") 2023-05-27 09:35:56 -04:00
Wing Lian
157420df13 sane starter defaults and add lora 2023-05-27 09:33:14 -04:00
Wing Lian
679ffd7395 Merge pull request #77 from OpenAccess-AI-Collective/falcoln-support ("add example for falcon support") 2023-05-27 09:18:48 -04:00
Wing Lian
d5f944ce2a add example for falcoln support 2023-05-27 09:16:43 -04:00
Wing Lian
4c906339f7 fix auto linear modules for lora w/o any set already 2023-05-27 08:49:43 -04:00
Wing Lian
4c500f5830 checking for False is not sufficent for NoneType/unset configs 2023-05-27 08:43:48 -04:00
Thytu
7cf07fc8b3 refactor(example): rename 4bit-lora-7b by gptq-lora-7b (Signed-off-by: Thytu <vdmatos@gladia.io>) 2023-05-27 12:37:53 +00:00
Thytu
dd0065773a refactor(param): rename load_4bit config param by gptq (Signed-off-by: Thytu <vdmatos@gladia.io>) 2023-05-27 12:36:03 +00:00
Wing Lian
c3d256271e fix wheel install glob 2023-05-26 10:37:02 -04:00
NanoCode012
46c5a44003 Merge pull request #69 from OpenAccess-AI-Collective/NanoCode012-quickstart-disable-xformers ("Fix: Disable xformers for QuickStart config") 2023-05-26 22:40:16 +09:00
NanoCode012
ec3c0314bf Merge pull request #65 from NanoCode012/feat/target-linear ("Feat: Add `cfg.lora_target_linear`") 2023-05-26 22:39:38 +09:00
NanoCode012
79560934f9 Disable xformers for QuickStart config 2023-05-26 22:23:38 +09:00
NanoCode012
353cebd838 Merge pull request #68 from OpenAccess-AI-Collective/NanoCode012-patch-1 ("Fix: Incorrect recommendation condition") 2023-05-26 22:20:31 +09:00
NanoCode012
fe0e69f4f9 Fix recommendation condition 2023-05-26 22:19:50 +09:00
Wing Lian
1fc9b44e3d fix wheel blobs in dockerfile 2023-05-26 07:40:11 -04:00
NanoCode012
919623793a Add cfg.lora_target_linear 2023-05-26 14:32:30 +09:00
Wing Lian
bbfc333a01 Merge pull request #62 from OpenAccess-AI-Collective/qlora-fixes ("Qlora fixes") 2023-05-26 00:28:16 -04:00
Wing Lian
a5bf838685 add logging and make sure model unloads to float16 2023-05-26 00:09:55 -04:00
Wing Lian
a4f12415a0 update readme and add typehints 2023-05-25 23:10:11 -04:00
Wing Lian
48f4c0571e fix validation for qlora merge 2023-05-25 23:02:03 -04:00
Wing Lian
1987e5cf56 qlora and 4bit check so we are able to merge and unload 2023-05-25 22:55:13 -04:00
Wing Lian
e7e1a777bd fix bool args according to python fire docs 2023-05-25 22:45:41 -04:00
Wing Lian
7b5e762be2 fix merge conflict failure, black format 2023-05-25 22:40:27 -04:00
Wing Lian
3f6017db9e qlora merge and load requires that base model isn't loaded in 4 or 8 bit 2023-05-25 22:39:13 -04:00
Wing Lian
34c99f9812 fixes to make qlora actually work 2023-05-25 22:37:23 -04:00
NanoCode012
3815c054b6 Merge pull request #61 from NanoCode012/feat/update-readme ("Feat: Update readme") 2023-05-26 11:27:31 +09:00
NanoCode012
85326bfbf3 Update quickstart config 2023-05-26 11:15:57 +09:00
NanoCode012
e689069afd Add xformers error 2023-05-26 11:12:03 +09:00
NanoCode012
d7d8bc739e Add strict yml 2023-05-26 11:10:59 +09:00
NanoCode012
60e32ff457 Fix shard config 2023-05-26 11:09:28 +09:00
Wing Lian
259262bf42 fix xentropy wheel name typo 2023-05-25 17:25:38 -04:00
Wing Lian
2e56203b50 another fix for shard and train split 2023-05-25 17:23:57 -04:00