From a6ebf57e827ff1d9c41238bf606ce7c3f7338f98 Mon Sep 17 00:00:00 2001
From: Wing Lian
Date: Sun, 11 Jun 2023 10:55:32 -0400
Subject: [PATCH] fix table formatting

---
 README.md | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/README.md b/README.md
index 0edeff447..8d201e739 100644
--- a/README.md
+++ b/README.md
@@ -16,14 +16,14 @@
 
 ## Axolotl supports
 
-| | fp16/fp32 | fp16/fp32 w/ lora | qlora | 4bit-quant | 4bit-quant w/flash attention | flash attention | xformers attention |
-|----------|:----------|:------------------|-------|------------|------------------------------|-----------------|--------------------|
-| llama | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| Pythia | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ❓ |
-| cerebras | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
-| mpt | ✅ | ❌ | ❓ | ❌ | ❌ | ❌ | ❓ |
-| falcon | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
-| gpt-j | ✅ | ✅ | ✅ | ❌ | ❌ | ❓ | ✅ |
+| | fp16/fp32 | fp16/fp32 w/ lora | qlora | gptq | gptq w/ lora | gptq w/flash attention | flash attention | xformers attention |
+|----------|:----------|:------------------|-------|------|:-------------|------------------------|-----------------|--------------------|
+| llama | ✅ | ✅ | ✅ | ✅ | ❓ | ✅ | ✅ | ✅ |
+| Pythia | ✅ | ✅ | ✅ | ❌ | ❓ | ❌ | ❌ | ❓ |
+| cerebras | ✅ | ✅ | ✅ | ❌ | ❓ | ❌ | ❌ | ✅ |
+| mpt | ✅ | ❌ | ❓ | ❌ | ❓ | ❌ | ❌ | ❓ |
+| falcon | ✅ | ✅ | ✅ | ❌ | ❓ | ❌ | ❌ | ✅ |
+| gpt-j | ✅ | ✅ | ✅ | ❌ | ❓ | ❌ | ❓ | ✅ |
 
 ## Quickstart ⚡