# Mac M series support

Currently Axolotl on Mac is only partially usable; many of Axolotl's dependencies, including PyTorch, either do not support MPS or have incomplete support for it.
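Before training, you can verify that your PyTorch build exposes the MPS backend and fall back to CPU otherwise. A minimal sketch, assuming a recent PyTorch that ships `torch.backends.mps` (the helper name `pick_device` is ours, not Axolotl's):

```python
import torch

def pick_device() -> str:
    """Return "mps" when PyTorch was built with MPS support and the
    current machine exposes it; otherwise fall back to "cpu"."""
    if torch.backends.mps.is_available():
        return "mps"
    return "cpu"

device = torch.device(pick_device())
# Tensors must be moved to the device explicitly; operators missing an
# MPS kernel raise unless PYTORCH_ENABLE_MPS_FALLBACK=1 is set, in
# which case they fall back to CPU with a warning.
x = torch.ones(2, 2, device=device)
print(x.device.type)
```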
Current support:

- [x] Support for all models
- [x] Full training of models
- [x] LoRA training
- [ ] Sample packing
- [ ] FP16 and BF16 (awaiting AMP support for MPS in PyTorch)
- [ ] Tri Dao's flash-attn (use `sdp_attention` as an alternative until it is supported)
- [ ] xformers
- [ ] bitsandbytes (meaning no 4/8-bit loading and no bnb optimizers)
- [ ] qlora
- [ ] DeepSpeed
Untested:

- [ ] FSDP
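Putting the constraints above together, an MPS-friendly run keeps the unsupported features switched off. This is a hedged sketch, not a complete config: the key names assume Axolotl's standard YAML schema, and the base model shown is just an illustration.

```yaml
# Sketch of MPS-friendly Axolotl settings; adjust to your model.
base_model: mistralai/Mistral-7B-v0.1

adapter: lora          # LoRA training works on MPS
lora_r: 8
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true

# Unsupported on MPS per the list above -- keep these off:
load_in_8bit: false    # bitsandbytes has no MPS support
load_in_4bit: false    # ...so no qlora either
sample_packing: false
flash_attention: false
sdp_attention: true    # alternative while flash-attn lacks MPS support
bf16: false            # AMP for MPS not yet available in PyTorch
fp16: false
```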