From 2e71ff03a6bd7ad74e72a69a90b00f9075d17933 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Aleksa=20Gordi=C4=87?=
Date: Fri, 27 Oct 2023 14:24:04 +0100
Subject: [PATCH] Add docker advanced instruction to README (#792)

---
 README.md | 19 +++++++++++++++++++
 1 file changed, 19 insertions(+)

diff --git a/README.md b/README.md
index 596641219..add56caf5 100644
--- a/README.md
+++ b/README.md
@@ -114,6 +114,25 @@ accelerate launch -m axolotl.cli.inference examples/openllama-3b/lora.yml \
 docker compose up -d
 ```
 
+<details>
+
+<summary>Docker advanced</summary>
+
+A more powerful Docker command to run would be this:
+
+```bash
+docker run --gpus '"all"' --rm -it --name axolotl --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 --mount type=volume,src=axolotl,target=/workspace/axolotl -v ${HOME}/.cache/huggingface:/root/.cache/huggingface winglian/axolotl:main-py3.10-cu118-2.0.1
+```
+
+It additionally:
+* Prevents memory issues when running e.g. deepspeed (you could otherwise hit a SIGBUS/signal 7 error) through the `--ipc` and `--ulimit` args.
+* Persists the downloaded HF data (models etc.) and your modifications to the axolotl code through the `--mount`/`-v` args.
+* The `--name` argument simply makes it easier to refer to the container in vscode (`Dev Containers: Attach to Running Container...`) or in your terminal.
+
+[More information on the nvidia website](https://docs.nvidia.com/deeplearning/frameworks/user-guide/index.html#setincshmem)
+
+</details>
+
 #### Conda/Pip venv
 
   1. Install python >=**3.9**
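Since the surrounding README also documents a `docker compose up -d` workflow, the flags in the `docker run` command above map roughly onto Compose fields as sketched below. This is an untested illustration, not part of the patch: the service name, volume name, and override-file usage are assumptions, and the GPU reservation syntax follows the Compose specification's `deploy.resources` form.

```yaml
# docker-compose.override.yml -- hypothetical sketch mirroring the docker run flags
services:
  axolotl:                         # assumed service name
    image: winglian/axolotl:main-py3.10-cu118-2.0.1
    ipc: host                      # mirrors --ipc=host
    ulimits:
      memlock: -1                  # mirrors --ulimit memlock=-1
      stack: 67108864              # mirrors --ulimit stack=67108864
    volumes:
      - axolotl:/workspace/axolotl                           # named volume, like --mount type=volume
      - ${HOME}/.cache/huggingface:/root/.cache/huggingface  # HF cache bind mount, like -v
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all           # rough equivalent of --gpus '"all"'
              capabilities: [gpu]

volumes:
  axolotl:                         # declares the named volume
```

One design note: the named volume survives `--rm`-style container removal, which is why the patch uses it for `/workspace/axolotl` while the HF cache is a plain bind mount into the host's home directory.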