tocmo0nlord / axolotl
Path: axolotl/configs
Commit: 81de0efc18abc6526166bfc2996a19837461c35a
Latest commit by Wing Lian (81de0efc18): add support for alpaca reflect training (#2), 2023-04-18 08:34:05 -04:00
cerebras_1_3B_alpaca.yml: 4bit quantized support (wip) [2023-04-17 11:37:39 -04:00]
llama_7B_4bit.yml: fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets [2023-04-17 18:01:12 -04:00]
llama_7B_alpaca.yml: fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets [2023-04-17 18:01:12 -04:00]
llama_65B_alpaca.yml: fix lora target module, require explicit flash attention, fix min logging steps, don't use adam8bit for int4, hash prepared datasets, support hf hub datasets [2023-04-17 18:01:12 -04:00]
pythia_1_2B_alpaca.yml: 4bit quantized support (wip) [2023-04-17 11:37:39 -04:00]
vicuna_13B_4bit_reflect.yml: add support for alpaca reflect training (#2) [2023-04-18 08:34:05 -04:00]
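The files above are axolotl YAML training configs. As a rough orientation, a minimal sketch of what such a LoRA fine-tuning config might look like follows; every key name, model id, and value here is an illustrative assumption based on common axolotl conventions, not the actual contents of any file in this directory:

```yaml
# Hypothetical sketch of an axolotl-style LoRA fine-tuning config.
# Key names and values are assumptions for illustration; consult the
# real .yml files in this directory for the actual options.
base_model: huggyllama/llama-7b   # assumed model id
datasets:
  - path: tatsu-lab/alpaca        # assumed Hugging Face hub dataset
    type: alpaca                  # alpaca-style instruction formatting
adapter: lora
lora_r: 8
lora_alpha: 16
lora_target_modules:              # per the commit message, the target modules were fixed
  - q_proj
  - v_proj
micro_batch_size: 4
num_epochs: 3
learning_rate: 0.0003
flash_attention: true             # per the commit message, must be enabled explicitly
```

The commit messages hint at the knobs these configs exercise: explicit flash-attention opt-in, LoRA target-module selection, 4-bit (int4) quantized training, and datasets pulled from the Hugging Face hub.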