axolotl/examples/jamba/README.md
Wing Lian 02af0820f7 Jamba (#1451)
* fixes for larger models

* add qlora example for deepspeed

* add readme for jamba
2024-03-28 21:03:22 -04:00


# Jamba
- qlora w/ deepspeed: needs at least 2x GPUs with 35GiB VRAM per GPU
- qlora single-GPU: training will start, but the loss is off by an order of magnitude
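
For the multi-GPU case, a launch might look like the following sketch. The config filename and deepspeed JSON path are assumptions, not confirmed by this README; substitute the actual files shipped in this example directory.

```shell
# Sketch of a 2-GPU qlora + deepspeed launch via accelerate.
# The YAML and deepspeed config paths below are assumptions --
# check the files in examples/jamba/ and deepspeed_configs/.
accelerate launch -m axolotl.cli.train examples/jamba/qlora_deepspeed.yaml \
    --deepspeed deepspeed_configs/zero2.json
```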