---
title: "Installation"
format:
  html:
    toc: true
    toc-depth: 3
    number-sections: true
execute:
  enabled: false
---

This guide covers all the ways you can install and set up Axolotl for your environment.

## Requirements {#sec-requirements}

- NVIDIA GPU (Ampere architecture or newer for `bf16` and Flash Attention) or AMD GPU
- Python ≥3.11
- PyTorch ≥2.6.0

## Installation Methods {#sec-installation-methods}

::: {.callout-important}
Please make sure to have PyTorch installed before installing Axolotl in your local environment. Follow the instructions at: [https://pytorch.org/get-started/locally/](https://pytorch.org/get-started/locally/)
:::

::: {.callout-important}
For Blackwell GPUs, please use PyTorch 2.7.0 and CUDA 12.8.
:::

### uv Installation (Recommended) {#sec-uv-quick}

```{.bash}
# Install uv if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh

# Add Axolotl to a project (recommended)
uv init my-project && cd my-project
uv add axolotl
uv pip install flash-attn --no-build-isolation
source .venv/bin/activate
```

For a quick one-off install without creating a project:

```{.bash}
uv pip install axolotl
uv pip install flash-attn --no-build-isolation
```

### pip Installation {#sec-pypi}

```{.bash}
pip install --no-build-isolation axolotl[deepspeed]
pip install --no-build-isolation flash-attn
```

We use `--no-build-isolation` so that the build can detect the PyTorch version already installed in your environment rather than clobbering it, and so that dependencies specific to that PyTorch version (and to other installed co-dependencies) resolve correctly. Flash Attention is installed separately so it can be built against the environment configured by the previous step.

### Advanced uv Installation {#sec-uv}

uv is a fast, reliable Python package installer and resolver built in Rust. It offers significant performance improvements over pip and provides better dependency resolution, making it an excellent choice for complex environments.
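Before running the steps below, it can help to confirm whether uv is already on your `PATH` (a minimal shell sketch):

```{.bash}
# Print uv's version if it is already installed, otherwise say so
if command -v uv >/dev/null 2>&1; then
  uv --version
else
  echo "uv not found; run the install script below"
fi
```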
Install uv if not already installed:

```{.bash}
curl -LsSf https://astral.sh/uv/install.sh | sh
source $HOME/.local/bin/env
```

Choose the CUDA version to use with PyTorch (e.g. `cu124`, `cu126`, `cu128`), then create and activate the venv:

```{.bash}
export UV_TORCH_BACKEND=cu126
uv venv --no-project --relocatable
source .venv/bin/activate
```

Install PyTorch (2.6.0 recommended):

```{.bash}
uv pip install torch==2.6.0
uv pip install awscli pydantic
```

Install Axolotl from PyPI:

```{.bash}
uv pip install --no-build-isolation axolotl[deepspeed]
# optionally install with vLLM if you're using torch==2.6.0 and want to train w/ GRPO
# uv pip install --no-build-isolation axolotl[deepspeed,vllm]
uv pip install flash-attn --no-build-isolation
```

### Edge/Development Build {#sec-edge-build}

For the latest features between releases:

#### Using uv (recommended)

```{.bash}
git clone https://github.com/axolotl-ai-cloud/axolotl.git
cd axolotl
curl -LsSf https://astral.sh/uv/install.sh | sh  # If not already installed
uv sync
uv pip install flash-attn --no-build-isolation
```

#### Using pip

```{.bash}
git clone https://github.com/axolotl-ai-cloud/axolotl.git
cd axolotl
pip install --no-build-isolation -e '.[deepspeed]'
pip install --no-build-isolation flash-attn
```

### Docker {#sec-docker}

```{.bash}
docker run --gpus '"all"' --rm -it axolotlai/axolotl:main-latest
```

For development with Docker:

```{.bash}
docker compose up -d
```

::: {.callout-tip}
### Advanced Docker Configuration

```{.bash}
docker run --privileged --gpus '"all"' --shm-size 10g --rm -it \
  --name axolotl --ipc=host \
  --ulimit memlock=-1 --ulimit stack=67108864 \
  --mount type=bind,src="${PWD}",target=/workspace/axolotl \
  -v ${HOME}/.cache/huggingface:/root/.cache/huggingface \
  axolotlai/axolotl:main-latest
```
:::

::: {.callout-important}
For Blackwell GPUs, please use `axolotlai/axolotl:main-py3.11-cu128-2.7.0` or the cloud variant `axolotlai/axolotl-cloud:main-py3.11-cu128-2.7.0`.
:::

Please refer to the [Docker documentation](docker.qmd) for more information on the different Docker images that are available.

## Cloud Environments {#sec-cloud}

### Cloud GPU Providers {#sec-cloud-gpu}

For providers supporting Docker:

- Use `axolotlai/axolotl-cloud:main-latest`
- Available on:
  - [RunPod](https://runpod.io/gsc?template=v2ickqhz9s&ref=6i7fkpdz)
  - [Vast.ai](https://cloud.vast.ai?ref_id=62897&template_id=bdd4a49fa8bce926defc99471864cace&utm_source=axolotl&utm_medium=partner&utm_campaign=template_launch_july2025&utm_content=docs_link)
  - [PRIME Intellect](https://app.primeintellect.ai/dashboard/create-cluster?image=axolotl&location=Cheapest&security=Cheapest&show_spot=true)
  - [Modal](https://www.modal.com?utm_source=github&utm_medium=github&utm_campaign=axolotl)
  - [Novita](https://novita.ai/gpus-console?templateId=311)
  - [JarvisLabs.ai](https://jarvislabs.ai/templates/axolotl)
  - [Latitude.sh](https://latitude.sh/blueprint/989e0e79-3bf6-41ea-a46b-1f246e309d5c)

### Google Colab {#sec-colab}

[![](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/axolotl-ai-cloud/axolotl/blob/main/examples/colab-notebooks/colab-axolotl-example.ipynb#scrollTo=msOCO4NRmRLa)

## Platform-Specific Instructions {#sec-platform-specific}

### macOS {#sec-macos}

```{.bash}
uv pip install --no-build-isolation -e '.'
```

See @sec-troubleshooting for Mac-specific issues.

### Windows {#sec-windows}

::: {.callout-important}
We recommend using WSL2 (Windows Subsystem for Linux) or Docker.
:::

## Environment Managers {#sec-env-managers}

### Conda/Pip venv {#sec-conda}

1. Install Python ≥3.11
2. Install PyTorch: https://pytorch.org/get-started/locally/
3. Install Axolotl:

   ```{.bash}
   # Option A: add Axolotl to the environment
   uv add axolotl
   uv pip install flash-attn --no-build-isolation

   # Option B: quick install
   uv pip install axolotl
   uv pip install flash-attn --no-build-isolation
   ```

4. (Optional) Login to Hugging Face:

   ```{.bash}
   huggingface-cli login
   ```

## Troubleshooting {#sec-troubleshooting}

If you encounter installation issues, see our [FAQ](faq.qmd) and [Debugging Guide](debugging.qmd).
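If an install appears to succeed but imports fail at runtime, a quick first check is whether the currently active environment actually contains the packages from this guide (a sketch; the package names checked are the ones installed in the steps above):

```{.bash}
# Report the Python version and whether the key packages from this
# guide are importable in the currently active environment.
python3 - <<'EOF'
import importlib.util
import sys

print(f"Python {sys.version_info.major}.{sys.version_info.minor}")
for pkg in ("torch", "axolotl", "flash_attn"):
    status = "installed" if importlib.util.find_spec(pkg) else "MISSING"
    print(f"{pkg}: {status}")
EOF
```

If `torch` shows as installed but `axolotl` is MISSING, the install likely went into a different environment than the one currently activated.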