---
title: "PyTorch ao"
description: "Custom data types and layouts for training and inference"
---
### Installation
Stable release from the PyTorch index:
```bash
pip install torchao --extra-index-url https://download.pytorch.org/whl/cu121 # full options are cpu/cu118/cu121/cu124
```
Nightly release:
```bash
pip install --pre torchao-nightly --index-url https://download.pytorch.org/whl/nightly/cu121 # full options are cpu/cu118/cu121/cu124
```
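Once torchao is installed, its low-bit optimizers can be selected via the `optimizer` field of an axolotl config. A minimal sketch, assuming `ao_adamw_fp8` is among the supported optimizer names in your axolotl version (check your version's docs for the full list):

```yaml
# Hypothetical axolotl config fragment: select a torchao low-bit AdamW.
# The optimizer name is an assumption; verify it against your installed version.
optimizer: ao_adamw_fp8
```

Low-bit optimizer states reduce VRAM usage compared with standard fp32 AdamW, at the cost of some numerical precision in the optimizer's moment estimates.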