Author | Commit | Message | Date
Wing Lian | 9b790d359b | flash attention 2 | 2023-07-21 08:17:46 -04:00
NanoCode012 | 06c61d6f13 | Merge pull request #304 from OpenAccess-AI-Collective/NanoCode012-patch-1: Fix(readme): Improve wording for push model | 2023-07-21 13:39:45 +09:00
Wing Lian | 262dc29df2 | Merge pull request #300 from OpenAccess-AI-Collective/pytorch-201: Pytorch 2.0.1 | 2023-07-21 00:28:38 -04:00
NanoCode012 | 165907fddb | Fix(readme): Improve wording for push model | 2023-07-21 11:28:35 +09:00
Wing Lian | a032c9f452 | fix sdp attention to use the flash/mem-efficient context manager | 2023-07-20 01:05:48 -04:00
Wing Lian | b06d3e3645 | explicitly pin flash attention 1 to v1.0.9 | 2023-07-20 01:02:08 -04:00
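The pin above would appear in requirements.txt roughly as below; this is a sketch assuming the PyPI package name `flash-attn`, since the actual requirements line is not shown in this log:

```
flash-attn==1.0.9
```

Pinning an exact version avoids silently picking up flash-attention 2, which changed the API.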
Wing Lian | c58034d48c | use pytorch 2.0.1 | 2023-07-20 00:47:13 -04:00
NanoCode012 | 28fd429bcf | Merge pull request #293 from NanoCode012/fix/tokenize-speed: Fix(tokenizing): Use multi-core | 2023-07-19 11:02:04 +09:00
NanoCode012 | 45ac7c4f88 | feat: use multi-core | 2023-07-19 10:16:54 +09:00
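The multi-core tokenizing change above parallelizes dataset preprocessing (in HF `datasets` this is typically done by passing `num_proc` to `Dataset.map`). A minimal stdlib sketch of the same idea, with a hypothetical `tokenize` stand-in for the real tokenizer call:

```python
from multiprocessing import Pool

def tokenize(text):
    # stand-in for the real tokenizer call
    return text.split()

if __name__ == "__main__":
    texts = ["hello world", "multi core tokenizing"]
    # fan the work out across two worker processes instead of one
    with Pool(processes=2) as pool:
        tokenized = pool.map(tokenize, texts)
    print(tokenized)  # → [['hello', 'world'], ['multi', 'core', 'tokenizing']]
```

Tokenization is CPU-bound and per-example independent, so it scales nearly linearly with worker count.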
Wing Lian | edd6980dd9 | Merge pull request #289 from OpenAccess-AI-Collective/hf_transfer: add hf_transfer to requirements for faster hf upload | 2023-07-17 15:08:06 -04:00
Wing Lian | dc6d25124d | Merge pull request #288 from OpenAccess-AI-Collective/NanoCode012-patch-1: fix(readme): remove accelerate config | 2023-07-17 14:46:43 -04:00
Wing Lian | 6dd2e7d671 | add hf_transfer to requirements for faster hf upload | 2023-07-17 14:44:48 -04:00
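The `hf_transfer` dependency above is huggingface_hub's Rust-based transfer backend. A sketch of how it is wired up (the requirements line itself is not shown in this log):

```
# add to requirements.txt
hf_transfer
```

Installing it alone is not enough: huggingface_hub only uses it when `HF_HUB_ENABLE_HF_TRANSFER=1` is set in the environment.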
NanoCode012 | b64f411849 | fix(readme): remove accelerate config | 2023-07-18 01:31:02 +09:00
Wing Lian | 03a59c1ed4 | Merge pull request #287 from OpenAccess-AI-Collective/dataclass-fix: fix axolotl training args dataclass annotation | 2023-07-17 06:09:23 -04:00
Wing Lian | ebaec3c406 | fix axolotl training args dataclass annotation | 2023-07-17 04:57:02 -04:00
Wing Lian | 73e70e3996 | Merge pull request #286 from OpenAccess-AI-Collective/logging-docker-fixes: misc fixes | 2023-07-17 04:26:39 -04:00
Wing Lian | d75adb9835 | misc fixes | 2023-07-17 03:00:27 -04:00
Wing Lian | 02224668c3 | Merge pull request #283 from OpenAccess-AI-Collective/docker-git-fetch: git fetch fix for docker | 2023-07-17 02:17:00 -04:00
Wing Lian | f162f3c7cc | set transformers cache env var in docker image | 2023-07-16 23:03:54 -04:00
Wing Lian | eca3531329 | git fetch fix for docker | 2023-07-16 22:25:05 -04:00
Wing Lian | 6f16c4569d | Merge pull request #276 from theobjectivedad/logging_enhancement: Logging update: added PID and formatting | 2023-07-16 17:04:52 -04:00
Wing Lian | 0bd09c077d | Merge pull request #280 from teknium1/main: Update requirements.txt | 2023-07-16 16:08:58 -04:00
Wing Lian | 469c08c9ba | Merge pull request #279 from NanoCode012/feat/multi-gpu-readme: Feat(readme): improve docs on multi-gpu | 2023-07-16 16:08:37 -04:00
Wing Lian | 334af625d0 | Merge pull request #277 from cg123/dataset-name: Allow non-default dataset configurations | 2023-07-16 16:08:15 -04:00
Teknium | 273b3a3aa7 | Update requirements.txt: Require latest git accelerate to fix saving checkpoint issue | 2023-07-16 10:24:24 -07:00
Charles Goddard | 3cdd8e4122 | Add dataset name to all yaml options in README | 2023-07-15 13:17:37 -07:00
NanoCode012 | cf5ae6b649 | Feat(readme): improve docs on multi-gpu | 2023-07-16 01:07:27 +09:00
theobjectivedad | b1f4f7a34d | Fixed pre-commit problems, fixed small bug in logging_config to handle LOG_LEVEL env var | 2023-07-15 12:29:35 +00:00
The Objective Dad | 83237b8445 | Merge branch 'OpenAccess-AI-Collective:main' into logging_enhancement | 2023-07-15 06:16:04 -05:00
Charles Goddard | 46032a1a1f | Fix formatting mistake | 2023-07-14 20:57:27 -07:00
Charles Goddard | 8bba64258e | Add example of dataset with configuration name to README | 2023-07-14 20:46:21 -07:00
Charles Goddard | 88089e8b32 | Add ability to pass 'name' argument to load_dataset | 2023-07-14 16:46:39 -07:00
NanoCode012 | 168a7a09cc | Merge pull request #274 from OpenAccess-AI-Collective/NanoCode012-patch-2: Feat: Set push to hub as private by default | 2023-07-14 23:15:47 +09:00
NanoCode012 | 231031a0e1 | Merge pull request #275 from NanoCode012/feat/safetensors: Feat: Add save_safetensors | 2023-07-14 23:07:26 +09:00
theobjectivedad | 9234b75cb4 | Update log message format, IMO this is easier to read. | 2023-07-14 07:36:21 -05:00
theobjectivedad | 553a86b52c | Adding logging enhancement | 2023-07-14 07:26:19 -05:00
NanoCode012 | 5daf7d5299 | Merge pull request #273 from OpenAccess-AI-Collective/NanoCode012-patch-1: Feat(docs): Add model_revision arg | 2023-07-14 21:09:50 +09:00
NanoCode012 | 5491278a79 | Feat: Add save_safetensors | 2023-07-14 13:21:47 +09:00
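Going by the PR title, the `save_safetensors` feature above presumably surfaces as a training-config flag. A hypothetical YAML fragment, not taken from the actual docs:

```yaml
# opt in to safetensors-format checkpoints instead of pickle-based .bin files
save_safetensors: true
```

safetensors files load faster and avoid the arbitrary-code-execution risk of pickled weights.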
NanoCode012 | 1514739f0f | Set push to hub as private by default | 2023-07-14 13:17:49 +09:00
NanoCode012 | 896c1aebcf | Feat(docs): Add model_revision arg | 2023-07-14 12:56:07 +09:00
Wing Lian | ef17e15483 | Merge pull request #272 from OpenAccess-AI-Collective/model-revision: support for loading a model by git revision | 2023-07-13 23:12:00 -04:00
Wing Lian | 69a235061b | support for loading a model by git revision | 2023-07-13 22:58:25 -04:00
Wing Lian | 687d889928 | Merge pull request #271 from OpenAccess-AI-Collective/quadratic-warmup: Quadratic warmup | 2023-07-10 12:48:02 -04:00
Wing Lian | c4cf567b55 | Merge branch 'main' into quadratic-warmup | 2023-07-10 12:42:12 -04:00
Wing Lian | c49729d2bc | better configuration for quadratic warmup | 2023-07-10 11:52:59 -04:00
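The quadratic warmup referenced above ramps the learning rate with the square of training progress rather than linearly, so early steps are gentler. A minimal sketch; the function name and signature are illustrative, not axolotl's actual API:

```python
def quadratic_warmup_lr(step, warmup_steps, max_lr):
    """LR rises quadratically from 0 to max_lr over warmup_steps, then holds."""
    if step >= warmup_steps:
        return max_lr
    return max_lr * (step / warmup_steps) ** 2

# halfway through warmup the LR is only a quarter of max_lr,
# versus half of max_lr under a linear schedule
print(quadratic_warmup_lr(50, 100, 2e-4))  # → 5e-05
```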
Wing Lian | 13ac4d8de2 | Merge pull request #268 from OpenAccess-AI-Collective/fix-adam-args: params are adam_*, not adamw_* | 2023-07-08 12:33:34 -04:00
Wing Lian | 19cf0bda99 | params are adam_*, not adamw_* | 2023-07-08 12:13:39 -04:00
Wing Lian | f74edd5b56 | Merge pull request #266 from OpenAccess-AI-Collective/trust-remote-no-llama | 2023-07-07 21:38:11 -04:00
Wing Lian | d69da99c2c | skip explicit model type too if using trust_remote_code | 2023-07-07 21:33:11 -04:00
Wing Lian | 66afb76a15 | don't use llama if trust_remote_code is set since that needs to use AutoModel path | 2023-07-07 21:31:02 -04:00