tocmo0nlord / axolotl
axolotl / configs (at commit e1076430ff2fe886bac49c0191bd1b29cddb5421)
Latest commit 87e073d0de by Wing Lian (2023-04-17 18:01:12 -04:00): fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets
cerebras_1_3B_alpaca.yml   4bit quantized support (wip)   2023-04-17 11:37:39 -04:00
llama_7B_4bit.yml          fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets   2023-04-17 18:01:12 -04:00
llama_7B_alpaca.yml        fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets   2023-04-17 18:01:12 -04:00
llama_65B_alpaca.yml       fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets   2023-04-17 18:01:12 -04:00
pythia_1_2B_alpaca.yml     4bit quantized support (wip)   2023-04-17 11:37:39 -04:00
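The .yml files listed above are axolotl training configs. As an illustrative sketch only (the field names below are typical axolotl config keys and are assumptions; none of it is copied from these files), a LoRA fine-tune of a LLaMA 7B model in the spirit of llama_7B_alpaca.yml might look roughly like:

```yaml
# Hypothetical axolotl config sketch -- field names and values are
# illustrative assumptions, not the contents of the repo's files.
base_model: huggyllama/llama-7b      # assumed base checkpoint
model_type: LlamaForCausalLM
tokenizer_type: LlamaTokenizer

datasets:
  - path: tatsu-lab/alpaca           # assumed HF hub dataset (commit adds hub support)
    type: alpaca

adapter: lora
lora_r: 8
lora_alpha: 16
lora_dropout: 0.05
lora_target_modules:                 # the commit message mentions fixing this field
  - q_proj
  - v_proj

flash_attention: true                # commit requires flash attention be explicit
logging_steps: 1                     # commit fixes the minimum logging steps
optimizer: adamw_torch               # commit avoids adam8bit for int4 runs
```

The commit messages in the listing hint at how such fields interact, e.g. that `lora_target_modules` had to be corrected and that 8-bit Adam is avoided when int4 quantization is in use.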