Wing Lian
ba47adc24b
replace attention in the yaml config with an enum
2025-05-07 17:10:18 +07:00
Wing Lian
dd8bad06d0
remove strict=false from example yamls (#2523) [skip ci]
2025-04-12 07:25:11 -07:00
Wing Lian
9f824ef76a
simplify the example configs to be more minimal and less daunting (#2486) [skip ci]
* simplify the example configs to be more minimal and less daunting
* drop empty s2_attention from example yamls
2025-04-04 13:47:26 -04:00
Sunny Liu
1c14c4a15c
Add hub model id config options to all example yml files (#2196) [skip ci]
* added hub model_id in example yml
* add hub model id to example yml
2024-12-17 11:24:30 -05:00
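The `hub_model_id` option added in that commit pushes trained checkpoints to the Hugging Face Hub. A minimal sketch, with a placeholder repo name:

```yaml
# push checkpoints to the Hugging Face Hub
# (the repo name below is a placeholder)
hub_model_id: your-username/your-model-name
```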
Wing Lian
4fde300e5f
update outputs path so that we can mount workspace to /workspace/data (#1623)
* update outputs path so that we can mount workspace to /workspace/data
* fix ln order
2024-05-15 12:44:13 -04:00
NanoCode012
a7a9a1433a
fix(examples): remove is_*_derived as it's parsed automatically (#1297)
2024-02-22 00:52:46 +09:00
Wing Lian
4cb7900a56
Peft loftq (#1222)
* loftq support for lora
* fix loftq check
* update readme for loftq
* readability cleanup
* use peft main for loftq fixes, remove unnecessary special tokens
* remove unused test from older deprecation
2024-01-28 18:50:08 -05:00
Wing Lian
54d2ac155b
Mixtral fixes 20240124 (#1192) [skip ci]
* mixtral nccl fixes
* make sure to patch for z3
2024-01-24 14:59:57 -05:00
Wing Lian
782b6a4216
set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
* set fp16 to false if bf16, update bf16: auto in example YAMLs
* unset fp16 so that it falls back properly if bf16 isn't available
* Update README.md [skip ci]
Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>
* test that bf16 disables fp16
---------
Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>
2024-01-22 18:44:01 -05:00
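After this change, the precision block in the example YAMLs looks roughly like the sketch below: `bf16: auto` uses bfloat16 when the hardware supports it, and `fp16` is left unset so the fallback works when it doesn't.

```yaml
# use bfloat16 when the GPU supports it
bf16: auto
# fp16 is intentionally left unset so axolotl can
# fall back to it when bf16 is unavailable
```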
Wing Lian
5f79b8242f
new evals_per_epoch and saves_per_epoch to make things cleaner (#944)
* new evals_per_epoch and saves_per_epoch to make things cleaner
* update per PR feedback
2023-12-12 15:35:23 -05:00
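The new options replace step-based scheduling with per-epoch counts. A minimal sketch with illustrative values:

```yaml
# evaluate four times per epoch and save one checkpoint per epoch
# (the values are examples, not recommendations)
evals_per_epoch: 4
saves_per_epoch: 1
```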
NanoCode012
a1da39cd48
Feat(wandb): Refactor to be more flexible (#767)
* Feat: Update to handle wandb env better
* chore: rename wandb_run_id to wandb_name
* feat: add new recommendation and update config
* fix: indent and pop disabled env if project passed
* feat: test env set for wandb and recommendation
* feat: update to use wandb_name and allow id
* chore: add info to readme
2023-12-04 22:17:25 +09:00
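Per the rename in this refactor, the run's display name now comes from `wandb_name` rather than `wandb_run_id`. A minimal sketch with placeholder values:

```yaml
# placeholder project and run names
wandb_project: my-project
wandb_name: my-run   # formerly wandb_run_id
```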
Wing Lian
f544ab2bed
don't compile deepspeed or bitsandbytes from source (#837)
2023-11-08 19:49:55 -05:00
Wing Lian
2d8def68dc
simplify by removing duplicate base_model_config (#772)
2023-10-23 01:42:38 -04:00
Casper
15d3a654bf
Implement fused modules (#747)
* MLP: Memory saving
* Remove RMSNorm restrictions
* Map packed weights to original
* FusedAttention module
* Simplify code
* Move fused modules
* Fix critical typo
* Split inplace
* Add FFT config
* Add validation of fused arguments
* Add fused arguments to config
* Update docs
* Fix validation logic
* Add fused modules to flash attn
* Only fuse during training
* Remove timing
* Formatting
* Formatting
* Formatting
* chore: lint
* chore: lint
* add e2e tests for fused llama
* no lora for tests
---------
Co-authored-by: Wing Lian <wing.lian@gmail.com>
2023-10-21 16:08:25 -04:00
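Assuming the option names this PR's "Add fused arguments to config" bullet introduced are the `flash_attn_fuse_*` flags, enabling the fused modules looks roughly like:

```yaml
# fused QKV and MLP modules, applied only during training
# (option names are an assumption based on this PR's config additions)
flash_attention: true
flash_attn_fuse_qkv: true
flash_attn_fuse_mlp: true
```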