NanoCode012 09959fac70 Feat: add Magistral Small 2509 and native mistral3 tokenizer support (#3165), 2025-09-18
Magistral Small Thinking Fine-tuning

This guide covers fine-tuning Magistral Small 2507 with thinking capabilities using Axolotl. The thinking model enables explicit Chain-of-Thought reasoning with separate thinking and response sections.

Prerequisites

Before starting, ensure you have:

Getting Started

Run the thinking model fine-tuning:

axolotl train magistral-small-think-qlora.yaml

This config uses about 19.1 GiB VRAM.
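For orientation, a QLoRA config for a model like this typically contains entries along the following lines. This is a sketch only: the keys are standard Axolotl options, but the specific values, base model ID, and dataset path are assumptions, so defer to the bundled magistral-small-think-qlora.yaml for the real settings.

```yaml
# Sketch of a typical Axolotl QLoRA setup; values are illustrative assumptions.
base_model: mistralai/Magistral-Small-2507   # assumed model ID
load_in_4bit: true        # QLoRA: quantized 4-bit base weights
adapter: qlora
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true
sequence_len: 4096
micro_batch_size: 1
gradient_accumulation_steps: 4
datasets:
  - path: ./think_sample.jsonl   # placeholder dataset path
    type: chat_template
```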

Tips

  • The dataset uses the multi-content format with type: thinking support. See Dataset Format below.
  • You cannot mix content: str and content: list[dict] in the same dataset; mixing the two will make dataset loading fail. Keep the format consistent.
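To catch a mixed-format dataset before training starts, a quick pre-flight check can verify that every message uses the list-of-dicts content form. This helper is illustrative and not part of Axolotl:

```python
def check_messages(dataset):
    """Raise ValueError if any message's content is not a list of dicts.

    Mixing string content with multi-content messages breaks dataset loading,
    so every message should use the list[dict] form.
    """
    for i, example in enumerate(dataset):
        for msg in example["messages"]:
            content = msg["content"]
            if not (isinstance(content, list)
                    and all(isinstance(part, dict) for part in content)):
                raise ValueError(
                    f"Example {i}: role '{msg['role']}' uses non-list content; "
                    "keep every message in the multi-content list[dict] format."
                )

# This sample uses the consistent multi-content form and passes the check.
sample = {"messages": [
    {"role": "user", "content": [{"type": "text", "text": "hi"}]},
]}
check_messages([sample])
```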

Dataset Format

The thinking model requires the multi-content dataset format with support for an extra type: thinking entry within system and assistant messages.

Example format:

{
    "messages": [
        {
            "role": "system",
            "content": [
                { "type": "text", "text": "{SYSTEM_PROMPT}"}
            ]
        },
        {
            "role": "user",
            "content": [
                { "type": "text", "text": "Solve this step by step: What is 15% of 240?"}
            ]
        },
        {
            "role": "assistant",
            "content": [
                {
                    "type": "thinking",
                    "thinking": "I need to calculate 15% of 240. First, I'll convert 15% to decimal: 0.15. Then multiply: 0.15 × 240 = 36."
                },
                {
                    "type": "text",
                    "text": "To find 15% of 240, I'll multiply 240 by 0.15:\n\n240 × 0.15 = 36\n\nTherefore, 15% of 240 is 36."
                }
            ]
        }
    ]
}
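On disk, each example is stored as one JSON object per line (JSONL). A small helper for writing examples in this shape; the file name is a placeholder:

```python
import json

def write_jsonl(path, examples):
    """Write one JSON object per line, the JSONL layout Axolotl's loaders read."""
    with open(path, "w", encoding="utf-8") as f:
        for ex in examples:
            f.write(json.dumps(ex, ensure_ascii=False) + "\n")

# A minimal example in the multi-content format with a thinking entry.
example = {
    "messages": [
        {"role": "user",
         "content": [{"type": "text", "text": "What is 15% of 240?"}]},
        {"role": "assistant",
         "content": [
             {"type": "thinking", "thinking": "0.15 x 240 = 36."},
             {"type": "text", "text": "15% of 240 is 36."},
         ]},
    ]
}
write_jsonl("think_sample.jsonl", [example])  # placeholder path
```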

Advanced Options

The thinking section supports an optional closed parameter:

{
    "type": "thinking",
    "thinking": "Internal reasoning here...",
    "closed": true  // Default: true, controls adding the closing [/THINK] tag
}
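As an illustration of what the flag controls, a toy renderer might look like the following. This is not Axolotl's actual chat-template code; it only sketches the [THINK]/[/THINK] tagging convention described above:

```python
def render_thinking(part):
    """Wrap a thinking segment in [THINK] tags.

    'closed' (default True) controls whether the closing [/THINK] tag is added.
    """
    out = "[THINK]" + part["thinking"]
    if part.get("closed", True):
        out += "[/THINK]"
    return out

print(render_thinking({"type": "thinking", "thinking": "reason...", "closed": True}))
# With closed set to false, the closing tag is omitted.
print(render_thinking({"type": "thinking", "thinking": "reason...", "closed": False}))
```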