tocmo0nlord / axolotl
axolotl / configs (at commit 6045345d6bcea96991ce28d25985b5c456e2a286)

Latest commit: 6045345d6b by Wing Lian, "WIP large refactor to make finetune script a little more manageable (#3)" (2023-04-18 14:01:38 -04:00)
cerebras_1_3B_alpaca.yml
    4bit quantized support (wip) (2023-04-17 11:37:39 -04:00)

gpt_neox_20b.yml
    WIP large refactor to make finetune script a little more manageable (#3) (2023-04-18 14:01:38 -04:00)

llama_7B_4bit.yml
    fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets (2023-04-17 18:01:12 -04:00)

llama_7B_alpaca.yml
    fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets (2023-04-17 18:01:12 -04:00)

llama_65B_alpaca.yml
    fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets (2023-04-17 18:01:12 -04:00)

pythia_1_2B_alpaca.yml
    4bit quantized support (wip) (2023-04-17 11:37:39 -04:00)

vicuna_13B_4bit_reflect.yml
    add support for alpaca reflect training (#2) (2023-04-18 08:34:05 -04:00)
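The `.yml` files above are axolotl training configs pairing a base model with a dataset and (per the commit messages) LoRA and 4-bit settings. A minimal sketch of what such a file might look like follows; every field name and value here is an assumption based on typical axolotl config conventions, not taken from this commit's actual files.

```yaml
# Hypothetical axolotl config sketch. All keys and values below are
# illustrative assumptions, not the contents of any file in this listing.
base_model: huggyllama/llama-7b   # assumed model id
load_in_4bit: true                # 4-bit quantized loading ("4bit quantized support (wip)")
datasets:
  - path: tatsu-lab/alpaca        # assumed dataset id
    type: alpaca
adapter: lora                     # LoRA fine-tuning
lora_r: 8
lora_alpha: 16
lora_dropout: 0.05
lora_target_modules:              # cf. "fix lora target module" in the commits above
  - q_proj
  - v_proj
sequence_len: 2048
micro_batch_size: 2
gradient_accumulation_steps: 4
num_epochs: 3
learning_rate: 0.0002
output_dir: ./out
```

The schema at this commit (April 2023) may differ; treat the sketch as a shape, not a runnable config.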