Angainor Development
bd3b537344
Feed cfg.inference
2023-06-09 08:59:05 +02:00
Angainor Development
813cfa4c14
WIP: Rely on cfg.inference
2023-06-09 08:49:32 +02:00
Angainor Development
193c73bce0
Fix training over existing lora
...
When training with LoRA and starting from existing LoRA weights, the current code produces a model with 0 trainable params, so training can't work.
Adding the "is_trainable" param allows the loaded PEFT model to be trained and fixes the bug.
2023-06-08 09:18:58 +02:00
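The fix above refers to the `is_trainable` flag on PEFT's `PeftModel.from_pretrained`: by default a loaded adapter is frozen for inference, which leaves zero trainable parameters. A minimal dependency-free sketch of that behavior (the `ToyParam`/`load_adapter` names are hypothetical stand-ins, not PEFT API):

```python
class ToyParam:
    """Stand-in for a model parameter with a requires_grad flag."""
    def __init__(self, requires_grad):
        self.requires_grad = requires_grad

def load_adapter(n_params, is_trainable=False):
    # Mimics loading LoRA adapter weights: without is_trainable=True,
    # every loaded adapter parameter is frozen (requires_grad=False).
    return [ToyParam(requires_grad=is_trainable) for _ in range(n_params)]

def trainable_count(params):
    # Count parameters that the optimizer would actually update.
    return sum(p.requires_grad for p in params)
```

In real code the fix amounts to passing `is_trainable=True` to `PeftModel.from_pretrained` when resuming training from existing LoRA weights.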
Wing Lian
6abfd87d44
Merge pull request #158 from OpenAccess-AI-Collective/prompter-fixes
...
fix camel ai, add guanaco/oasst mapping for sharegpt
2023-06-07 11:02:30 -04:00
Wing Lian
59bb2197ed
fix camel ai, add guanaco/oasst mapping for sharegpt
2023-06-07 09:51:29 -04:00
Wing Lian
9a02e7e1ff
Merge pull request #155 from OpenAccess-AI-Collective/misc-fixes
...
new prompters, misc fixes for output dir missing using fsdp, and changing max seq len
2023-06-06 16:52:39 -04:00
Wing Lian
5b33e295bd
update docs
2023-06-05 22:48:16 -04:00
Wing Lian
4ac9e251b7
new prompters, misc fixes for output dir missing using fsdp, and changing max seq len
2023-06-05 22:41:00 -04:00
Wing Lian
328c3bce96
Merge pull request #149 from OpenAccess-AI-Collective/docker-clone-axolotl
...
clone in docker
2023-06-02 15:15:30 -04:00
Wing Lian
5cd2126439
shallow clone
2023-06-02 14:54:28 -04:00
Wing Lian
12620f3089
clone in docker
2023-06-02 14:52:50 -04:00
Wing Lian
4ab0c8b201
Merge pull request #148 from OpenAccess-AI-Collective/fix-device-load
2023-06-02 14:37:17 -04:00
Wing Lian
74ebbf4371
fix device map
2023-06-02 14:29:08 -04:00
Wing Lian
76a70fd739
Merge pull request #147 from OpenAccess-AI-Collective/winglian-rocker-images
...
Update README.md for correct image tags
2023-06-02 14:10:40 -04:00
Wing Lian
618816d4df
Update README.md for correct image tags
2023-06-02 14:10:23 -04:00
Wing Lian
91992cb8f5
Merge pull request #146 from FarisHijazi/main
...
added docker-compose file
2023-06-02 13:58:23 -04:00
FarisHijazi
84169d15b3
added docker-compose file
2023-06-02 18:17:43 +03:00
Wing Lian
ecfe8d0a1a
Merge pull request #142 from NanoCode012/feat/custom-prompt-readme
...
Feat: Add custom prompt readme and add missing prompt strategies to Readme
2023-06-02 07:21:04 -04:00
Wing Lian
eee44a3b47
Merge pull request #141 from NanoCode012/feat/lambdalabs-readme
...
Feat: Add lambdalabs instruction
2023-06-02 07:20:12 -04:00
NanoCode012
078a43eef8
Remove redundant instruction
2023-06-02 12:30:11 +09:00
NanoCode012
33e1890086
Add pygmalion
2023-06-02 12:27:51 +09:00
NanoCode012
1c38253692
Add other prompt_strategies
2023-06-02 12:24:44 +09:00
NanoCode012
496b83f778
Add short instruction for custom prompts
2023-06-02 12:16:20 +09:00
NanoCode012
ff68a95781
Add lambdalabs instruction
2023-06-02 12:09:40 +09:00
NanoCode012
288fd62431
Merge pull request #135 from NanoCode012/fix/grad-accu-readme
...
Fix: Update doc for grad_accu and add validation tests for batch size
2023-06-01 06:33:05 +09:00
NanoCode012
3c71c8debe
Update doc for grad_accu and add validation tests for batch size
2023-06-01 06:13:47 +09:00
Wing Lian
a6f5e5eaec
Merge pull request #134 from OpenAccess-AI-Collective/gas-batch-fix
...
fix batch size calculation
2023-05-31 14:24:48 -04:00
Wing Lian
5a631b305b
fix batch size calculation
2023-05-31 14:11:32 -04:00
Wing Lian
f94dd626f0
Merge pull request #130 from OpenAccess-AI-Collective/gas
...
swap batch size for gradient accumulation steps to decouple from num gpu
2023-05-31 13:03:51 -04:00
Wing Lian
5079753b7a
Merge pull request #131 from OpenAccess-AI-Collective/fix-packing-mask
...
fix packing so that concatenated sequences reset the attention
2023-05-31 13:03:37 -04:00
Wing Lian
0136f510f2
don't worry about duplicate code here
2023-05-31 12:05:43 -04:00
Wing Lian
9b8585dc70
fix packing so that concatenated sequences reset the attention
2023-05-31 11:38:52 -04:00
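The packing fix above is about making sure that when several sequences are concatenated into one training example, tokens of one sequence cannot attend to tokens of another. A minimal sketch (the function name is illustrative, not axolotl's actual code) that builds a block-diagonal causal mask from per-sequence lengths:

```python
def packed_attention_mask(seq_lens):
    """Causal attention mask for packed sequences: attention is allowed
    only within each segment, and only to earlier-or-equal positions."""
    total = sum(seq_lens)
    mask = [[False] * total for _ in range(total)]
    start = 0
    for n in seq_lens:
        for i in range(start, start + n):
            for j in range(start, i + 1):  # causal within the segment
                mask[i][j] = True
        start += n  # next segment resets attention to its own block
    return mask
```

Without the per-segment reset, a naive causal mask over the concatenation would let the second sequence attend to the first.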
Wing Lian
8eb5811d4e
Merge pull request #129 from OpenAccess-AI-Collective/builder-badge
...
add badge info to readme
2023-05-31 10:37:59 -04:00
Wing Lian
e0011fdf55
Fix base builder, missing tags
2023-05-31 09:52:03 -04:00
Wing Lian
6e9e98720e
Merge pull request #127 from OpenAccess-AI-Collective/py310-docker-runpod
...
add py310 support from base image
2023-05-31 09:39:42 -04:00
Wing Lian
c2a0792680
swap batch size for gradient accumulation steps to decouple from num gpu
2023-05-31 09:38:12 -04:00
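The decoupling described above boils down to configuring micro batch size and gradient accumulation steps directly, so the per-step effective batch size scales predictably with GPU count instead of being divided by it. A small sketch of the arithmetic (function name is illustrative):

```python
def effective_batch_size(micro_batch_size, gradient_accumulation_steps, num_gpus):
    # Total examples contributing to a single optimizer step:
    # each GPU processes micro_batch_size examples per forward pass,
    # accumulated over gradient_accumulation_steps passes.
    return micro_batch_size * gradient_accumulation_steps * num_gpus
```

For example, micro batch 2 with 4 accumulation steps on 2 GPUs yields an effective batch size of 16 per optimizer step.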
Wing Lian
b267d24a2b
add badge info to readme
2023-05-31 09:28:44 -04:00
Wing Lian
5c3f5db38b
Add files via upload
2023-05-31 09:22:54 -04:00
Wing Lian
e3d03745ba
add py310 support from base image
2023-05-31 09:07:28 -04:00
NanoCode012
fac46002d4
Merge pull request #119 from NanoCode012/feat/update-inference
...
Feat(inference): Swap to GenerationConfig
2023-05-31 14:09:18 +09:00
NanoCode012
33d40179ba
Increase max_new_tokens
...
Co-authored-by: Wing Lian <wing.lian@gmail.com>
2023-05-31 14:04:49 +09:00
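The GenerationConfig swap above replaces scattering generation kwargs across each `generate` call with a single config object (in real code, `transformers.GenerationConfig` passed as `generation_config=` to `model.generate`). A dependency-free sketch of the pattern, with `GenConfig` and `generate` as hypothetical stand-ins:

```python
from dataclasses import dataclass, asdict

@dataclass
class GenConfig:
    """Stand-in for transformers.GenerationConfig: one object holds
    all sampling parameters instead of per-call keyword arguments."""
    max_new_tokens: int = 256   # the follow-up commit also raised this
    temperature: float = 0.7
    do_sample: bool = True

def generate(prompt, cfg: GenConfig):
    # A real implementation would call
    # model.generate(..., generation_config=cfg); here we just echo
    # the resolved settings to show they travel with the config object.
    return {"prompt": prompt, **asdict(cfg)}
```

Centralizing the settings makes it easy to reuse one sampling profile across inference entry points.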
Wing Lian
dcb03d6da4
Merge pull request #114 from OpenAccess-AI-Collective/accelerate-dep
...
Add accelerate dep
2023-05-31 00:47:17 -04:00
NanoCode012
0e4be625ae
Merge pull request #118 from NanoCode012/feat/torch-readme
...
Fix(readme): Fix torch missing from readme
2023-05-31 13:29:41 +09:00
NanoCode012
bdc4bd7d4e
Update README.md
2023-05-31 13:24:28 +09:00
Wing Lian
2d0ba3b818
Merge pull request #124 from OpenAccess-AI-Collective/xformers-fix
...
copy xformers attn from ooba since we removed dep on alpaca_lora_4bit
2023-05-31 00:11:40 -04:00
Wing Lian
c7021e191f
Merge pull request #120 from OpenAccess-AI-Collective/model-from-path
...
split up llama model loading so config can be loaded from base config and models can be loaded from a path
2023-05-31 00:08:38 -04:00
Wing Lian
c56818b119
don't worry about dupes
2023-05-31 00:06:47 -04:00
Wing Lian
2675fb756e
update readme for SDP
2023-05-31 00:04:54 -04:00
Wing Lian
1076bcbbca
Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
...
Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>
2023-05-31 00:00:19 -04:00
Wing Lian
2daa6835f0
Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
...
Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>
2023-05-30 23:59:05 -04:00