Built site for gh-pages
@@ -477,6 +477,7 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
<ul class="collapse">
<li><a href="#axolotl.monkeypatch.lora_kernels.apply_lora_kernel_patches" id="toc-axolotl.monkeypatch.lora_kernels.apply_lora_kernel_patches" class="nav-link" data-scroll-target="#axolotl.monkeypatch.lora_kernels.apply_lora_kernel_patches">apply_lora_kernel_patches</a></li>
<li><a href="#axolotl.monkeypatch.lora_kernels.get_attention_cls_from_config" id="toc-axolotl.monkeypatch.lora_kernels.get_attention_cls_from_config" class="nav-link" data-scroll-target="#axolotl.monkeypatch.lora_kernels.get_attention_cls_from_config">get_attention_cls_from_config</a></li>
<li><a href="#axolotl.monkeypatch.lora_kernels.get_layers" id="toc-axolotl.monkeypatch.lora_kernels.get_layers" class="nav-link" data-scroll-target="#axolotl.monkeypatch.lora_kernels.get_layers">get_layers</a></li>
<li><a href="#axolotl.monkeypatch.lora_kernels.original_apply_o" id="toc-axolotl.monkeypatch.lora_kernels.original_apply_o" class="nav-link" data-scroll-target="#axolotl.monkeypatch.lora_kernels.original_apply_o">original_apply_o</a></li>
<li><a href="#axolotl.monkeypatch.lora_kernels.original_apply_qkv" id="toc-axolotl.monkeypatch.lora_kernels.original_apply_qkv" class="nav-link" data-scroll-target="#axolotl.monkeypatch.lora_kernels.original_apply_qkv">original_apply_qkv</a></li>
<li><a href="#axolotl.monkeypatch.lora_kernels.patch_self_attn_lora" id="toc-axolotl.monkeypatch.lora_kernels.patch_self_attn_lora" class="nav-link" data-scroll-target="#axolotl.monkeypatch.lora_kernels.patch_self_attn_lora">patch_self_attn_lora</a></li>
@@ -536,14 +537,18 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
<td>Get the appropriate attention class by inspecting the model config.</td>
</tr>
<tr class="odd">
<td><a href="#axolotl.monkeypatch.lora_kernels.get_layers">get_layers</a></td>
<td>Get the layers of the model. Handles text-only and multimodal models.</td>
</tr>
<tr class="even">
<td><a href="#axolotl.monkeypatch.lora_kernels.original_apply_o">original_apply_o</a></td>
<td>Original implementation of output projection without optimizations.</td>
</tr>
<tr class="even">
<tr class="odd">
<td><a href="#axolotl.monkeypatch.lora_kernels.original_apply_qkv">original_apply_qkv</a></td>
<td>Original implementation of QKV projection without optimizations.</td>
</tr>
<tr class="odd">
<tr class="even">
<td><a href="#axolotl.monkeypatch.lora_kernels.patch_self_attn_lora">patch_self_attn_lora</a></td>
<td>Given an <code>axolotl</code> config, this method patches the inferred attention class forward</td>
</tr>
@@ -740,13 +745,58 @@ the standard transformers naming convention.</p>
</table>
</section>
</section>
<section id="axolotl.monkeypatch.lora_kernels.original_apply_o" class="level3">
<h3 class="anchored" data-anchor-id="axolotl.monkeypatch.lora_kernels.original_apply_o">original_apply_o</h3>
<div class="sourceCode" id="cb4"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb4-1"><a href="#cb4-1" aria-hidden="true" tabindex="-1"></a>monkeypatch.lora_kernels.original_apply_o(<span class="va">self</span>, hidden_states)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Original implementation of output projection without optimizations.</p>
<section id="axolotl.monkeypatch.lora_kernels.get_layers" class="level3">
<h3 class="anchored" data-anchor-id="axolotl.monkeypatch.lora_kernels.get_layers">get_layers</h3>
<div class="sourceCode" id="cb4"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb4-1"><a href="#cb4-1" aria-hidden="true" tabindex="-1"></a>monkeypatch.lora_kernels.get_layers(model)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Get the layers of the model. Handles text-only and multimodal models.</p>
<section id="parameters-2" class="level4 doc-section doc-section-parameters">
<h4 class="doc-section doc-section-parameters anchored" data-anchor-id="parameters-2">Parameters</h4>
<table class="caption-top table">
<thead>
<tr class="header">
<th>Name</th>
<th>Type</th>
<th>Description</th>
<th>Default</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td>model</td>
<td>PeftModelForCausalLM</td>
<td>A PEFT model.</td>
<td><em>required</em></td>
</tr>
</tbody>
</table>
</section>
<section id="returns-2" class="level4 doc-section doc-section-returns">
<h4 class="doc-section doc-section-returns anchored" data-anchor-id="returns-2">Returns</h4>
<table class="caption-top table">
<thead>
<tr class="header">
<th>Name</th>
<th>Type</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td></td>
<td>list[nn.Module]</td>
<td>A list of layers.</td>
</tr>
</tbody>
</table>
</section>
</section>
<section id="axolotl.monkeypatch.lora_kernels.original_apply_o" class="level3">
<h3 class="anchored" data-anchor-id="axolotl.monkeypatch.lora_kernels.original_apply_o">original_apply_o</h3>
<div class="sourceCode" id="cb5"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb5-1"><a href="#cb5-1" aria-hidden="true" tabindex="-1"></a>monkeypatch.lora_kernels.original_apply_o(<span class="va">self</span>, hidden_states)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Original implementation of output projection without optimizations.</p>
<section id="parameters-3" class="level4 doc-section doc-section-parameters">
<h4 class="doc-section doc-section-parameters anchored" data-anchor-id="parameters-3">Parameters</h4>
<table class="caption-top table">
<colgroup>
<col style="width: 14%">
<col style="width: 13%">
@@ -777,8 +827,8 @@ the standard transformers naming convention.</p>
</tbody>
</table>
</section>
<section id="returns-2" class="level4 doc-section doc-section-returns">
<h4 class="doc-section doc-section-returns anchored" data-anchor-id="returns-2">Returns</h4>
<section id="returns-3" class="level4 doc-section doc-section-returns">
<h4 class="doc-section doc-section-returns anchored" data-anchor-id="returns-3">Returns</h4>
<table class="caption-top table">
<thead>
<tr class="header">
@@ -799,10 +849,10 @@ the standard transformers naming convention.</p>
</section>
<section id="axolotl.monkeypatch.lora_kernels.original_apply_qkv" class="level3">
<h3 class="anchored" data-anchor-id="axolotl.monkeypatch.lora_kernels.original_apply_qkv">original_apply_qkv</h3>
<div class="sourceCode" id="cb5"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb5-1"><a href="#cb5-1" aria-hidden="true" tabindex="-1"></a>monkeypatch.lora_kernels.original_apply_qkv(<span class="va">self</span>, hidden_states)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb6"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb6-1"><a href="#cb6-1" aria-hidden="true" tabindex="-1"></a>monkeypatch.lora_kernels.original_apply_qkv(<span class="va">self</span>, hidden_states)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Original implementation of QKV projection without optimizations.</p>
<section id="parameters-3" class="level4 doc-section doc-section-parameters">
<h4 class="doc-section doc-section-parameters anchored" data-anchor-id="parameters-3">Parameters</h4>
<section id="parameters-4" class="level4 doc-section doc-section-parameters">
<h4 class="doc-section doc-section-parameters anchored" data-anchor-id="parameters-4">Parameters</h4>
<table class="caption-top table">
<colgroup>
<col style="width: 15%">
@@ -834,8 +884,8 @@ the standard transformers naming convention.</p>
</tbody>
</table>
</section>
<section id="returns-3" class="level4 doc-section doc-section-returns">
<h4 class="doc-section doc-section-returns anchored" data-anchor-id="returns-3">Returns</h4>
<section id="returns-4" class="level4 doc-section doc-section-returns">
<h4 class="doc-section doc-section-returns anchored" data-anchor-id="returns-4">Returns</h4>
<table class="caption-top table">
<colgroup>
<col style="width: 4%">
@@ -861,13 +911,13 @@ the standard transformers naming convention.</p>
</section>
<section id="axolotl.monkeypatch.lora_kernels.patch_self_attn_lora" class="level3">
<h3 class="anchored" data-anchor-id="axolotl.monkeypatch.lora_kernels.patch_self_attn_lora">patch_self_attn_lora</h3>
<div class="sourceCode" id="cb6"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb6-1"><a href="#cb6-1" aria-hidden="true" tabindex="-1"></a>monkeypatch.lora_kernels.patch_self_attn_lora(cfg)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb7"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb7-1"><a href="#cb7-1" aria-hidden="true" tabindex="-1"></a>monkeypatch.lora_kernels.patch_self_attn_lora(cfg)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Given an <code>axolotl</code> config, this method patches the inferred attention class forward
pass with optimized LoRA implementations.</p>
<p>It modifies the attention class to use optimized QKV and output projections. The
original implementation is preserved and can be restored if needed.</p>
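The preserve-and-restore behaviour described above follows the usual monkeypatching pattern, sketched here with a dummy class. DummyAttention and both forward functions are hypothetical stand-ins, not axolotl's attention classes or kernels.

```python
# Minimal sketch of the patch-and-restore pattern: swap in an optimized
# forward while keeping a handle to the original. All names here are
# illustrative stand-ins for axolotl's real attention classes/kernels.
class DummyAttention:
    def forward(self, hidden_states):
        return hidden_states + 1  # stand-in for the stock projection path

def optimized_forward(self, hidden_states):
    return hidden_states + 2  # stand-in for the fused LoRA kernel path

# Preserve the original so the patch can be reverted later.
ORIGINAL_FORWARD = DummyAttention.forward
DummyAttention.forward = optimized_forward

def unpatch():
    # Restoring is a single reassignment on the class.
    DummyAttention.forward = ORIGINAL_FORWARD
```

Because the swap happens on the class object, every existing and future instance picks up the patched forward until `unpatch()` is called.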
<section id="parameters-4" class="level4 doc-section doc-section-parameters">
<h4 class="doc-section doc-section-parameters anchored" data-anchor-id="parameters-4">Parameters</h4>
<section id="parameters-5" class="level4 doc-section doc-section-parameters">
<h4 class="doc-section doc-section-parameters anchored" data-anchor-id="parameters-5">Parameters</h4>
<table class="caption-top table">
<colgroup>
<col style="width: 9%">
@@ -562,8 +562,12 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
</section>
<section id="usage" class="level3">
<h3 class="anchored" data-anchor-id="usage">Usage</h3>
<div class="sourceCode" id="cb3"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb3-1"><a href="#cb3-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb3-2"><a href="#cb3-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.cut_cross_entropy.CutCrossEntropyPlugin</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p><strong>NOTE</strong>: If you are training a VLM, please use an older version of Axolotl, as upstream has applied a major VLM refactor and our patches have not yet been updated.</p>
<div class="sourceCode" id="cb3"><pre class="sourceCode bash code-with-copy"><code class="sourceCode bash"><span id="cb3-1"><a href="#cb3-1" aria-hidden="true" tabindex="-1"></a><span class="fu">git</span> checkout 787880215b3ab32ccaf81c1b2e9588c6f3e6e764</span>
<span id="cb3-2"><a href="#cb3-2" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb3-3"><a href="#cb3-3" aria-hidden="true" tabindex="-1"></a><span class="ex">pip3</span> install <span class="at">--no-build-isolation</span> <span class="at">-e</span> .</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb4"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb4-1"><a href="#cb4-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb4-2"><a href="#cb4-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.cut_cross_entropy.CutCrossEntropyPlugin</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
</section>
<section id="supported-models" class="level3">
<h3 class="anchored" data-anchor-id="supported-models">Supported Models</h3>
@@ -593,17 +597,17 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
</section>
<section id="citation" class="level3">
<h3 class="anchored" data-anchor-id="citation">Citation</h3>
<div class="sourceCode" id="cb4"><pre class="sourceCode bib code-with-copy"><code class="sourceCode bibtex"><span id="cb4-1"><a href="#cb4-1" aria-hidden="true" tabindex="-1"></a><span class="va">@article</span>{<span class="ot">wijmans2024cut</span>,</span>
<span id="cb4-2"><a href="#cb4-2" aria-hidden="true" tabindex="-1"></a> <span class="dt">author</span> = {Erik Wijmans and</span>
<span id="cb4-3"><a href="#cb4-3" aria-hidden="true" tabindex="-1"></a> Brody Huval and</span>
<span id="cb4-4"><a href="#cb4-4" aria-hidden="true" tabindex="-1"></a> Alexander Hertzberg and</span>
<span id="cb4-5"><a href="#cb4-5" aria-hidden="true" tabindex="-1"></a> Vladlen Koltun and</span>
<span id="cb4-6"><a href="#cb4-6" aria-hidden="true" tabindex="-1"></a> Philipp Kr<span class="ch">\"</span>ahenb<span class="ch">\"</span>uhl},</span>
<span id="cb4-7"><a href="#cb4-7" aria-hidden="true" tabindex="-1"></a> <span class="dt">title</span> = {Cut Your Losses in Large-Vocabulary Language Models},</span>
<span id="cb4-8"><a href="#cb4-8" aria-hidden="true" tabindex="-1"></a> <span class="dt">journal</span> = {arXiv},</span>
<span id="cb4-9"><a href="#cb4-9" aria-hidden="true" tabindex="-1"></a> <span class="dt">year</span> = {2024},</span>
<span id="cb4-10"><a href="#cb4-10" aria-hidden="true" tabindex="-1"></a> <span class="dt">url</span> = {https://arxiv.org/abs/2411.09009},</span>
<span id="cb4-11"><a href="#cb4-11" aria-hidden="true" tabindex="-1"></a>}</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb5"><pre class="sourceCode bib code-with-copy"><code class="sourceCode bibtex"><span id="cb5-1"><a href="#cb5-1" aria-hidden="true" tabindex="-1"></a><span class="va">@article</span>{<span class="ot">wijmans2024cut</span>,</span>
<span id="cb5-2"><a href="#cb5-2" aria-hidden="true" tabindex="-1"></a> <span class="dt">author</span> = {Erik Wijmans and</span>
<span id="cb5-3"><a href="#cb5-3" aria-hidden="true" tabindex="-1"></a> Brody Huval and</span>
<span id="cb5-4"><a href="#cb5-4" aria-hidden="true" tabindex="-1"></a> Alexander Hertzberg and</span>
<span id="cb5-5"><a href="#cb5-5" aria-hidden="true" tabindex="-1"></a> Vladlen Koltun and</span>
<span id="cb5-6"><a href="#cb5-6" aria-hidden="true" tabindex="-1"></a> Philipp Kr<span class="ch">\"</span>ahenb<span class="ch">\"</span>uhl},</span>
<span id="cb5-7"><a href="#cb5-7" aria-hidden="true" tabindex="-1"></a> <span class="dt">title</span> = {Cut Your Losses in Large-Vocabulary Language Models},</span>
<span id="cb5-8"><a href="#cb5-8" aria-hidden="true" tabindex="-1"></a> <span class="dt">journal</span> = {arXiv},</span>
<span id="cb5-9"><a href="#cb5-9" aria-hidden="true" tabindex="-1"></a> <span class="dt">year</span> = {2024},</span>
<span id="cb5-10"><a href="#cb5-10" aria-hidden="true" tabindex="-1"></a> <span class="dt">url</span> = {https://arxiv.org/abs/2411.09009},</span>
<span id="cb5-11"><a href="#cb5-11" aria-hidden="true" tabindex="-1"></a>}</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Please see the reference <a href="https://github.com/axolotl-ai-cloud/axolotl/tree/main/src/axolotl/integrations/cut_cross_entropy">here</a>.</p>
</section>
</section>
@@ -612,20 +616,20 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
<p>See https://github.com/ironjr/grokfast</p>
<section id="usage-1" class="level3">
<h3 class="anchored" data-anchor-id="usage-1">Usage</h3>
<div class="sourceCode" id="cb5"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb5-1"><a href="#cb5-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb5-2"><a href="#cb5-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.grokfast.GrokfastPlugin</span></span>
<span id="cb5-3"><a href="#cb5-3" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb5-4"><a href="#cb5-4" aria-hidden="true" tabindex="-1"></a><span class="fu">grokfast_alpha</span><span class="kw">:</span><span class="at"> </span><span class="fl">2.0</span></span>
<span id="cb5-5"><a href="#cb5-5" aria-hidden="true" tabindex="-1"></a><span class="fu">grokfast_lamb</span><span class="kw">:</span><span class="at"> </span><span class="fl">0.98</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb6"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb6-1"><a href="#cb6-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb6-2"><a href="#cb6-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.grokfast.GrokfastPlugin</span></span>
<span id="cb6-3"><a href="#cb6-3" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb6-4"><a href="#cb6-4" aria-hidden="true" tabindex="-1"></a><span class="fu">grokfast_alpha</span><span class="kw">:</span><span class="at"> </span><span class="fl">2.0</span></span>
<span id="cb6-5"><a href="#cb6-5" aria-hidden="true" tabindex="-1"></a><span class="fu">grokfast_lamb</span><span class="kw">:</span><span class="at"> </span><span class="fl">0.98</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
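The two config knobs above feed the Grokfast-EMA gradient filter from the linked repository. A minimal sketch of that update, using the paper's symbols and its default alpha=0.98, lamb=2.0; how these map onto axolotl's grokfast_alpha/grokfast_lamb keys is an assumption left unstated here:

```python
# Grokfast-EMA sketch: maintain an exponential moving average of the
# gradient and add a scaled copy of it back to the raw gradient, so the
# slow (low-frequency) gradient component is amplified. Follows the
# update described at https://github.com/ironjr/grokfast.
def gradfilter_ema(grad, ema, alpha=0.98, lamb=2.0):
    ema = alpha * ema + (1.0 - alpha) * grad   # slow-gradient EMA
    return grad + lamb * ema, ema              # (amplified gradient, new EMA)

# One filtering step from a zero-initialized EMA.
filtered, ema = gradfilter_ema(1.0, 0.0)
```

In training, this filter would be applied to each parameter's gradient between `backward()` and the optimizer step, carrying the EMA state across steps.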
</section>
<section id="citation-1" class="level3">
<h3 class="anchored" data-anchor-id="citation-1">Citation</h3>
<div class="sourceCode" id="cb6"><pre class="sourceCode bib code-with-copy"><code class="sourceCode bibtex"><span id="cb6-1"><a href="#cb6-1" aria-hidden="true" tabindex="-1"></a><span class="va">@article</span>{<span class="ot">lee2024grokfast</span>,</span>
<span id="cb6-2"><a href="#cb6-2" aria-hidden="true" tabindex="-1"></a> <span class="dt">title</span>={{Grokfast}: Accelerated Grokking by Amplifying Slow Gradients},</span>
<span id="cb6-3"><a href="#cb6-3" aria-hidden="true" tabindex="-1"></a> <span class="dt">author</span>={Lee, Jaerin and Kang, Bong Gyun and Kim, Kihoon and Lee, Kyoung Mu},</span>
<span id="cb6-4"><a href="#cb6-4" aria-hidden="true" tabindex="-1"></a> <span class="dt">journal</span>={arXiv preprint arXiv:2405.20233},</span>
<span id="cb6-5"><a href="#cb6-5" aria-hidden="true" tabindex="-1"></a> <span class="dt">year</span>={2024}</span>
<span id="cb6-6"><a href="#cb6-6" aria-hidden="true" tabindex="-1"></a>}</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb7"><pre class="sourceCode bib code-with-copy"><code class="sourceCode bibtex"><span id="cb7-1"><a href="#cb7-1" aria-hidden="true" tabindex="-1"></a><span class="va">@article</span>{<span class="ot">lee2024grokfast</span>,</span>
<span id="cb7-2"><a href="#cb7-2" aria-hidden="true" tabindex="-1"></a> <span class="dt">title</span>={{Grokfast}: Accelerated Grokking by Amplifying Slow Gradients},</span>
<span id="cb7-3"><a href="#cb7-3" aria-hidden="true" tabindex="-1"></a> <span class="dt">author</span>={Lee, Jaerin and Kang, Bong Gyun and Kim, Kihoon and Lee, Kyoung Mu},</span>
<span id="cb7-4"><a href="#cb7-4" aria-hidden="true" tabindex="-1"></a> <span class="dt">journal</span>={arXiv preprint arXiv:2405.20233},</span>
<span id="cb7-5"><a href="#cb7-5" aria-hidden="true" tabindex="-1"></a> <span class="dt">year</span>={2024}</span>
<span id="cb7-6"><a href="#cb7-6" aria-hidden="true" tabindex="-1"></a>}</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Please see the reference <a href="https://github.com/axolotl-ai-cloud/axolotl/tree/main/src/axolotl/integrations/grokfast">here</a>.</p>
</section>
</section>
@@ -633,21 +637,21 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
<h2 class="anchored" data-anchor-id="knowledge-distillation-kd">Knowledge Distillation (KD)</h2>
<section id="usage-2" class="level3">
<h3 class="anchored" data-anchor-id="usage-2">Usage</h3>
<div class="sourceCode" id="cb7"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb7-1"><a href="#cb7-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb7-2"><a href="#cb7-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> </span><span class="st">"axolotl.integrations.kd.KDPlugin"</span></span>
<span id="cb7-3"><a href="#cb7-3" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb7-4"><a href="#cb7-4" aria-hidden="true" tabindex="-1"></a><span class="fu">kd_trainer</span><span class="kw">:</span><span class="at"> </span><span class="ch">True</span></span>
<span id="cb7-5"><a href="#cb7-5" aria-hidden="true" tabindex="-1"></a><span class="fu">kd_ce_alpha</span><span class="kw">:</span><span class="at"> </span><span class="fl">0.1</span></span>
<span id="cb7-6"><a href="#cb7-6" aria-hidden="true" tabindex="-1"></a><span class="fu">kd_alpha</span><span class="kw">:</span><span class="at"> </span><span class="fl">0.9</span></span>
<span id="cb7-7"><a href="#cb7-7" aria-hidden="true" tabindex="-1"></a><span class="fu">kd_temperature</span><span class="kw">:</span><span class="at"> </span><span class="fl">1.0</span></span>
<span id="cb7-8"><a href="#cb7-8" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb7-9"><a href="#cb7-9" aria-hidden="true" tabindex="-1"></a><span class="fu">torch_compile</span><span class="kw">:</span><span class="at"> </span><span class="ch">True</span><span class="co"> # torch>=2.5.1, recommended to reduce vram</span></span>
<span id="cb7-10"><a href="#cb7-10" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb7-11"><a href="#cb7-11" aria-hidden="true" tabindex="-1"></a><span class="fu">datasets</span><span class="kw">:</span></span>
<span id="cb7-12"><a href="#cb7-12" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> </span><span class="fu">path</span><span class="kw">:</span><span class="at"> ...</span></span>
<span id="cb7-13"><a href="#cb7-13" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="fu">type</span><span class="kw">:</span><span class="at"> </span><span class="st">"axolotl.integrations.kd.chat_template"</span></span>
<span id="cb7-14"><a href="#cb7-14" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="fu">field_messages</span><span class="kw">:</span><span class="at"> </span><span class="st">"messages_combined"</span></span>
<span id="cb7-15"><a href="#cb7-15" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="fu">logprobs_field</span><span class="kw">:</span><span class="at"> </span><span class="st">"llm_text_generation_vllm_logprobs"</span><span class="co"> # for kd only, field of logprobs</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb8"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb8-1"><a href="#cb8-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb8-2"><a href="#cb8-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> </span><span class="st">"axolotl.integrations.kd.KDPlugin"</span></span>
<span id="cb8-3"><a href="#cb8-3" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb8-4"><a href="#cb8-4" aria-hidden="true" tabindex="-1"></a><span class="fu">kd_trainer</span><span class="kw">:</span><span class="at"> </span><span class="ch">True</span></span>
<span id="cb8-5"><a href="#cb8-5" aria-hidden="true" tabindex="-1"></a><span class="fu">kd_ce_alpha</span><span class="kw">:</span><span class="at"> </span><span class="fl">0.1</span></span>
<span id="cb8-6"><a href="#cb8-6" aria-hidden="true" tabindex="-1"></a><span class="fu">kd_alpha</span><span class="kw">:</span><span class="at"> </span><span class="fl">0.9</span></span>
<span id="cb8-7"><a href="#cb8-7" aria-hidden="true" tabindex="-1"></a><span class="fu">kd_temperature</span><span class="kw">:</span><span class="at"> </span><span class="fl">1.0</span></span>
<span id="cb8-8"><a href="#cb8-8" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb8-9"><a href="#cb8-9" aria-hidden="true" tabindex="-1"></a><span class="fu">torch_compile</span><span class="kw">:</span><span class="at"> </span><span class="ch">True</span><span class="co"> # torch>=2.5.1, recommended to reduce vram</span></span>
<span id="cb8-10"><a href="#cb8-10" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb8-11"><a href="#cb8-11" aria-hidden="true" tabindex="-1"></a><span class="fu">datasets</span><span class="kw">:</span></span>
<span id="cb8-12"><a href="#cb8-12" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> </span><span class="fu">path</span><span class="kw">:</span><span class="at"> ...</span></span>
<span id="cb8-13"><a href="#cb8-13" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="fu">type</span><span class="kw">:</span><span class="at"> </span><span class="st">"axolotl.integrations.kd.chat_template"</span></span>
<span id="cb8-14"><a href="#cb8-14" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="fu">field_messages</span><span class="kw">:</span><span class="at"> </span><span class="st">"messages_combined"</span></span>
<span id="cb8-15"><a href="#cb8-15" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="fu">logprobs_field</span><span class="kw">:</span><span class="at"> </span><span class="st">"llm_text_generation_vllm_logprobs"</span><span class="co"> # for kd only, field of logprobs</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>An example dataset can be found at <a href="https://huggingface.co/datasets/axolotl-ai-co/evolkit-logprobs-pipeline-75k-v2-sample"><code>axolotl-ai-co/evolkit-logprobs-pipeline-75k-v2-sample</code></a></p>
<p>Please see the reference <a href="https://github.com/axolotl-ai-cloud/axolotl/tree/main/src/axolotl/integrations/kd">here</a>.</p>
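As a rough sketch of how the three loss knobs above usually combine, here is a Hinton-style distillation loss: a weighted sum of hard-label cross-entropy and a temperature-softened teacher/student KL term. This mirrors the standard formulation, not necessarily axolotl's exact implementation.

```python
import math

# kd_ce_alpha weights the ordinary cross-entropy against ground-truth
# labels; kd_alpha weights the distillation term; kd_temperature softens
# both distributions before the KL is taken. Standard KD, used here only
# to illustrate the knobs in the config above.
def softmax(logits, temperature=1.0):
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, label,
            kd_ce_alpha=0.1, kd_alpha=0.9, kd_temperature=1.0):
    # Hard-label cross-entropy against the ground-truth token.
    ce = -math.log(softmax(student_logits)[label])
    # Forward KL between temperature-softened teacher and student.
    p_t = softmax(teacher_logits, kd_temperature)
    p_s = softmax(student_logits, kd_temperature)
    kl = sum(p * math.log(p / q) for p, q in zip(p_t, p_s))
    return kd_ce_alpha * ce + kd_alpha * kl
```

With the defaults above (0.1 / 0.9), most of the signal comes from matching the teacher's distribution; when student and teacher agree exactly, the KL term vanishes and only the scaled cross-entropy remains.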
</section>
@@ -663,13 +667,13 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
<p>See https://github.com/linkedin/Liger-Kernel</p>
<section id="usage-3" class="level3">
<h3 class="anchored" data-anchor-id="usage-3">Usage</h3>
<div class="sourceCode" id="cb8"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb8-1"><a href="#cb8-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb8-2"><a href="#cb8-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.liger.LigerPlugin</span></span>
<span id="cb8-3"><a href="#cb8-3" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_rope</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span>
<span id="cb8-4"><a href="#cb8-4" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_rms_norm</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span>
<span id="cb8-5"><a href="#cb8-5" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_glu_activation</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span>
<span id="cb8-6"><a href="#cb8-6" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_layer_norm</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span>
<span id="cb8-7"><a href="#cb8-7" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_fused_linear_cross_entropy</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb9"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb9-1"><a href="#cb9-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb9-2"><a href="#cb9-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.liger.LigerPlugin</span></span>
<span id="cb9-3"><a href="#cb9-3" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_rope</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span>
<span id="cb9-4"><a href="#cb9-4" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_rms_norm</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span>
<span id="cb9-5"><a href="#cb9-5" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_glu_activation</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span>
<span id="cb9-6"><a href="#cb9-6" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_layer_norm</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span>
<span id="cb9-7"><a href="#cb9-7" aria-hidden="true" tabindex="-1"></a><span class="fu">liger_fused_linear_cross_entropy</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
|
||||
</section>
<section id="supported-models-1" class="level3">
<h3 class="anchored" data-anchor-id="supported-models-1">Supported Models</h3>
@@ -695,16 +699,16 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
</section>
<section id="citation-2" class="level3">
<h3 class="anchored" data-anchor-id="citation-2">Citation</h3>
<div class="sourceCode" id="cb9"><pre class="sourceCode bib code-with-copy"><code class="sourceCode bibtex"><span id="cb9-1"><a href="#cb9-1" aria-hidden="true" tabindex="-1"></a><span class="va">@article</span>{<span class="ot">hsu2024ligerkernelefficienttriton</span>,</span>
<span id="cb9-2"><a href="#cb9-2" aria-hidden="true" tabindex="-1"></a> <span class="dt">title</span>={Liger Kernel: Efficient Triton Kernels for LLM Training},</span>
<span id="cb9-3"><a href="#cb9-3" aria-hidden="true" tabindex="-1"></a> <span class="dt">author</span>={Pin-Lun Hsu and Yun Dai and Vignesh Kothapalli and Qingquan Song and Shao Tang and Siyu Zhu and Steven Shimizu and Shivam Sahni and Haowen Ning and Yanning Chen},</span>
<span id="cb9-4"><a href="#cb9-4" aria-hidden="true" tabindex="-1"></a> <span class="dt">year</span>={2024},</span>
<span id="cb9-5"><a href="#cb9-5" aria-hidden="true" tabindex="-1"></a> <span class="dt">eprint</span>={2410.10989},</span>
<span id="cb9-6"><a href="#cb9-6" aria-hidden="true" tabindex="-1"></a> <span class="dt">archivePrefix</span>={arXiv},</span>
<span id="cb9-7"><a href="#cb9-7" aria-hidden="true" tabindex="-1"></a> <span class="dt">primaryClass</span>={cs.LG},</span>
<span id="cb9-8"><a href="#cb9-8" aria-hidden="true" tabindex="-1"></a> <span class="dt">url</span>={https://arxiv.org/abs/2410.10989},</span>
<span id="cb9-9"><a href="#cb9-9" aria-hidden="true" tabindex="-1"></a> <span class="dt">journal</span>={arXiv preprint arXiv:2410.10989},</span>
<span id="cb9-10"><a href="#cb9-10" aria-hidden="true" tabindex="-1"></a>}</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Please see the full integration reference <a href="https://github.com/axolotl-ai-cloud/axolotl/tree/main/src/axolotl/integrations/liger">here</a>.</p>
</section>
</section>
@@ -714,29 +718,29 @@ gtag('config', 'G-9KYCVJBNMQ', { 'anonymize_ip': true});
<p>See <a href="https://github.com/EleutherAI/lm-evaluation-harness">https://github.com/EleutherAI/lm-evaluation-harness</a></p>
<section id="usage-4" class="level3">
<h3 class="anchored" data-anchor-id="usage-4">Usage</h3>
<div class="sourceCode" id="cb10"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb10-1"><a href="#cb10-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb10-2"><a href="#cb10-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.lm_eval.LMEvalPlugin</span></span>
<span id="cb10-3"><a href="#cb10-3" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb10-4"><a href="#cb10-4" aria-hidden="true" tabindex="-1"></a><span class="fu">lm_eval_tasks</span><span class="kw">:</span></span>
<span id="cb10-5"><a href="#cb10-5" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> gsm8k</span></span>
<span id="cb10-6"><a href="#cb10-6" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> hellaswag</span></span>
<span id="cb10-7"><a href="#cb10-7" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> arc_easy</span></span>
<span id="cb10-8"><a href="#cb10-8" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb10-9"><a href="#cb10-9" aria-hidden="true" tabindex="-1"></a><span class="fu">lm_eval_batch_size</span><span class="kw">:</span><span class="co"> # Batch size for evaluation</span></span>
<span id="cb10-10"><a href="#cb10-10" aria-hidden="true" tabindex="-1"></a><span class="fu">output_dir</span><span class="kw">:</span><span class="co"> # Directory to save evaluation results</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
</section>
<section id="citation-3" class="level3">
<h3 class="anchored" data-anchor-id="citation-3">Citation</h3>
<div class="sourceCode" id="cb11"><pre class="sourceCode bib code-with-copy"><code class="sourceCode bibtex"><span id="cb11-1"><a href="#cb11-1" aria-hidden="true" tabindex="-1"></a><span class="va">@misc</span>{<span class="ot">eval</span>-<span class="ot">harness</span>,</span>
<span id="cb11-2"><a href="#cb11-2" aria-hidden="true" tabindex="-1"></a>  <span class="dt">author</span>       = {Gao, Leo and Tow, Jonathan and Abbasi, Baber and Biderman, Stella and Black, Sid and DiPofi, Anthony and Foster, Charles and Golding, Laurence and Hsu, Jeffrey and Le Noac'h, Alain and Li, Haonan and McDonell, Kyle and Muennighoff, Niklas and Ociepa, Chris and Phang, Jason and Reynolds, Laria and Schoelkopf, Hailey and Skowron, Aviya and Sutawika, Lintang and Tang, Eric and Thite, Anish and Wang, Ben and Wang, Kevin and Zou, Andy},</span>
<span id="cb11-3"><a href="#cb11-3" aria-hidden="true" tabindex="-1"></a>  <span class="dt">title</span>        = {A framework for few-shot language model evaluation},</span>
<span id="cb11-4"><a href="#cb11-4" aria-hidden="true" tabindex="-1"></a>  <span class="dt">month</span>        = 07,</span>
<span id="cb11-5"><a href="#cb11-5" aria-hidden="true" tabindex="-1"></a>  <span class="dt">year</span>         = 2024,</span>
<span id="cb11-6"><a href="#cb11-6" aria-hidden="true" tabindex="-1"></a>  <span class="dt">publisher</span>    = {Zenodo},</span>
<span id="cb11-7"><a href="#cb11-7" aria-hidden="true" tabindex="-1"></a>  <span class="dt">version</span>      = {v0.4.3},</span>
<span id="cb11-8"><a href="#cb11-8" aria-hidden="true" tabindex="-1"></a>  <span class="dt">doi</span>          = {10.5281/zenodo.12608602},</span>
<span id="cb11-9"><a href="#cb11-9" aria-hidden="true" tabindex="-1"></a>  <span class="dt">url</span>          = {https://zenodo.org/records/12608602}</span>
<span id="cb11-10"><a href="#cb11-10" aria-hidden="true" tabindex="-1"></a>}</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Please see the full integration reference <a href="https://github.com/axolotl-ai-cloud/axolotl/tree/main/src/axolotl/integrations/lm_eval">here</a>.</p>
</section>
</section>
@@ -752,23 +756,23 @@ By identifying the top n% of layers with the highest SNR, you can optimize train
</section>
<section id="usage-5" class="level3">
<h3 class="anchored" data-anchor-id="usage-5">Usage</h3>
<div class="sourceCode" id="cb12"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb12-1"><a href="#cb12-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb12-2"><a href="#cb12-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.spectrum.SpectrumPlugin</span></span>
<span id="cb12-3"><a href="#cb12-3" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb12-4"><a href="#cb12-4" aria-hidden="true" tabindex="-1"></a><span class="fu">spectrum_top_fraction</span><span class="kw">:</span><span class="at"> </span><span class="fl">0.5</span></span>
<span id="cb12-5"><a href="#cb12-5" aria-hidden="true" tabindex="-1"></a><span class="fu">spectrum_model_name</span><span class="kw">:</span><span class="at"> meta-llama/Meta-Llama-3.1-8B</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
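<p>The idea behind the <code>spectrum_top_fraction</code> setting above can be sketched in a few lines: rank layers by a signal-to-noise score and unfreeze only the top fraction. The score used here (mean absolute weight over standard deviation) is a toy stand-in, not Spectrum's actual spectral SNR computation, and the layer names are hypothetical:</p>

```python
import random
import statistics

def snr_score(weights):
    # Toy signal-to-noise proxy: mean magnitude over standard deviation.
    # Spectrum's real SNR analyses the weight-matrix spectrum instead.
    return statistics.fmean(abs(w) for w in weights) / statistics.stdev(weights)

random.seed(0)
# Hypothetical per-layer weights; layer 1 carries a strong "signal" offset.
layers = {
    "model.layers.0.mlp": [random.gauss(0, 1) for _ in range(64)],
    "model.layers.1.mlp": [random.gauss(2, 1) for _ in range(64)],
    "model.layers.2.mlp": [random.gauss(0, 1) for _ in range(64)],
}

top_fraction = 0.5  # mirrors spectrum_top_fraction in the config above
ranked = sorted(layers, key=lambda name: snr_score(layers[name]), reverse=True)
trainable = ranked[: max(1, int(len(ranked) * top_fraction))]
print(trainable)  # the highest-SNR layer(s) selected for training
```

<p>In the real integration this selection is handled for you; the plugin freezes everything outside the chosen fraction before training starts.</p>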
</section>
<section id="citation-4" class="level3">
<h3 class="anchored" data-anchor-id="citation-4">Citation</h3>
<div class="sourceCode" id="cb13"><pre class="sourceCode bib code-with-copy"><code class="sourceCode bibtex"><span id="cb13-1"><a href="#cb13-1" aria-hidden="true" tabindex="-1"></a><span class="va">@misc</span>{<span class="ot">hartford2024spectrumtargetedtrainingsignal</span>,</span>
<span id="cb13-2"><a href="#cb13-2" aria-hidden="true" tabindex="-1"></a>      <span class="dt">title</span>={Spectrum: Targeted Training on Signal to Noise Ratio},</span>
<span id="cb13-3"><a href="#cb13-3" aria-hidden="true" tabindex="-1"></a>      <span class="dt">author</span>={Eric Hartford and Lucas Atkins and Fernando Fernandes Neto and David Golchinfar},</span>
<span id="cb13-4"><a href="#cb13-4" aria-hidden="true" tabindex="-1"></a>      <span class="dt">year</span>={2024},</span>
<span id="cb13-5"><a href="#cb13-5" aria-hidden="true" tabindex="-1"></a>      <span class="dt">eprint</span>={2406.06623},</span>
<span id="cb13-6"><a href="#cb13-6" aria-hidden="true" tabindex="-1"></a>      <span class="dt">archivePrefix</span>={arXiv},</span>
<span id="cb13-7"><a href="#cb13-7" aria-hidden="true" tabindex="-1"></a>      <span class="dt">primaryClass</span>={cs.LG},</span>
<span id="cb13-8"><a href="#cb13-8" aria-hidden="true" tabindex="-1"></a>      <span class="dt">url</span>={https://arxiv.org/abs/2406.06623},</span>
<span id="cb13-9"><a href="#cb13-9" aria-hidden="true" tabindex="-1"></a>}</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Please see the full integration reference <a href="https://github.com/axolotl-ai-cloud/axolotl/tree/main/src/axolotl/integrations/spectrum">here</a>.</p>
</section>
</section>
@@ -782,7 +786,7 @@ By identifying the top n% of layers with the highest SNR, you can optimize train
<h3 class="anchored" data-anchor-id="requirements-1">Requirements</h3>
<ul>
<li><p>Axolotl with <code>llmcompressor</code> extras:</p>
<div class="sourceCode" id="cb14"><pre class="sourceCode bash code-with-copy"><code class="sourceCode bash"><span id="cb14-1"><a href="#cb14-1" aria-hidden="true" tabindex="-1"></a><span class="ex">pip</span> install <span class="st">"axolotl[llmcompressor]"</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div></li>
<li><p>Requires <code>llmcompressor >= 0.5.1</code></p></li>
</ul>
<p>This will install all necessary dependencies to fine-tune sparsified models using the integration.</p>
@@ -791,25 +795,25 @@ By identifying the top n% of layers with the highest SNR, you can optimize train
<section id="usage-6" class="level3">
<h3 class="anchored" data-anchor-id="usage-6">Usage</h3>
<p>To enable sparse fine-tuning with this integration, include the plugin in your Axolotl config:</p>
<div class="sourceCode" id="cb15"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb15-1"><a href="#cb15-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb15-2"><a href="#cb15-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.llm_compressor.LLMCompressorPlugin</span></span>
<span id="cb15-3"><a href="#cb15-3" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb15-4"><a href="#cb15-4" aria-hidden="true" tabindex="-1"></a><span class="fu">llmcompressor</span><span class="kw">:</span></span>
<span id="cb15-5"><a href="#cb15-5" aria-hidden="true" tabindex="-1"></a><span class="at">  </span><span class="fu">recipe</span><span class="kw">:</span></span>
<span id="cb15-6"><a href="#cb15-6" aria-hidden="true" tabindex="-1"></a><span class="at">    </span><span class="fu">finetuning_stage</span><span class="kw">:</span></span>
<span id="cb15-7"><a href="#cb15-7" aria-hidden="true" tabindex="-1"></a><span class="at">      </span><span class="fu">finetuning_modifiers</span><span class="kw">:</span></span>
<span id="cb15-8"><a href="#cb15-8" aria-hidden="true" tabindex="-1"></a><span class="at">        </span><span class="fu">ConstantPruningModifier</span><span class="kw">:</span></span>
<span id="cb15-9"><a href="#cb15-9" aria-hidden="true" tabindex="-1"></a><span class="at">          </span><span class="fu">targets</span><span class="kw">:</span><span class="at"> </span><span class="kw">[</span></span>
<span id="cb15-10"><a href="#cb15-10" aria-hidden="true" tabindex="-1"></a><span class="at">            </span><span class="st">'re:.*q_proj.weight'</span><span class="kw">,</span></span>
<span id="cb15-11"><a href="#cb15-11" aria-hidden="true" tabindex="-1"></a><span class="at">            </span><span class="st">'re:.*k_proj.weight'</span><span class="kw">,</span></span>
<span id="cb15-12"><a href="#cb15-12" aria-hidden="true" tabindex="-1"></a><span class="at">            </span><span class="st">'re:.*v_proj.weight'</span><span class="kw">,</span></span>
<span id="cb15-13"><a href="#cb15-13" aria-hidden="true" tabindex="-1"></a><span class="at">            </span><span class="st">'re:.*o_proj.weight'</span><span class="kw">,</span></span>
<span id="cb15-14"><a href="#cb15-14" aria-hidden="true" tabindex="-1"></a><span class="at">            </span><span class="st">'re:.*gate_proj.weight'</span><span class="kw">,</span></span>
<span id="cb15-15"><a href="#cb15-15" aria-hidden="true" tabindex="-1"></a><span class="at">            </span><span class="st">'re:.*up_proj.weight'</span><span class="kw">,</span></span>
<span id="cb15-16"><a href="#cb15-16" aria-hidden="true" tabindex="-1"></a><span class="at">            </span><span class="st">'re:.*down_proj.weight'</span><span class="kw">,</span></span>
<span id="cb15-17"><a href="#cb15-17" aria-hidden="true" tabindex="-1"></a><span class="at">          </span><span class="kw">]</span></span>
<span id="cb15-18"><a href="#cb15-18" aria-hidden="true" tabindex="-1"></a><span class="at">          </span><span class="fu">start</span><span class="kw">:</span><span class="at"> </span><span class="dv">0</span></span>
<span id="cb15-19"><a href="#cb15-19" aria-hidden="true" tabindex="-1"></a><span class="at">  </span><span class="fu">save_compressed</span><span class="kw">:</span><span class="at"> </span><span class="ch">true</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>This plugin <strong>does not apply pruning or sparsification itself</strong> — it is intended for <strong>fine-tuning models that have already been sparsified</strong>.</p>
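<p>Because the plugin only preserves an existing sparsity pattern rather than creating one, it can help to confirm that the checkpoint you are about to fine-tune is actually sparse. A minimal sketch with a toy weight matrix (plain Python, not part of the integration):</p>

```python
def sparsity(weight):
    # Fraction of exactly-zero entries in a 2-D weight matrix.
    flat = [v for row in weight for v in row]
    return sum(1 for v in flat if v == 0.0) / len(flat)

# Toy 4x4 weight with a 2:4-style pattern: two zeros in every group of four.
w = [[0.0, 1.3, 0.0, 2.1],
     [0.7, 0.0, 4.2, 0.0],
     [0.0, 5.5, 0.0, 6.9],
     [7.1, 0.0, 8.4, 0.0]]

print(f"sparsity: {sparsity(w):.2f}")  # 0.50
```

<p>A checkpoint that reports near-zero sparsity here was likely never pruned, and fine-tuning it with <code>ConstantPruningModifier</code> would have nothing to preserve.</p>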
<p>Pre-sparsified checkpoints can be:
- Generated using <a href="https://github.com/vllm-project/llm-compressor">LLMCompressor</a>
@@ -836,22 +840,22 @@ By identifying the top n% of layers with the highest SNR, you can optimize train
<p>After fine-tuning your sparse model, you can leverage vLLM for efficient inference.
You can also use LLMCompressor to apply additional quantization to your fine-tuned
sparse model before inference for even greater performance benefits:</p>
<div class="sourceCode" id="cb16"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb16-1"><a href="#cb16-1" aria-hidden="true" tabindex="-1"></a><span class="im">from</span> vllm <span class="im">import</span> LLM, SamplingParams</span>
|
||||
<span id="cb16-2"><a href="#cb16-2" aria-hidden="true" tabindex="-1"></a></span>
|
||||
<span id="cb16-3"><a href="#cb16-3" aria-hidden="true" tabindex="-1"></a>prompts <span class="op">=</span> [</span>
|
||||
<span id="cb16-4"><a href="#cb16-4" aria-hidden="true" tabindex="-1"></a> <span class="st">"Hello, my name is"</span>,</span>
|
||||
<span id="cb16-5"><a href="#cb16-5" aria-hidden="true" tabindex="-1"></a> <span class="st">"The president of the United States is"</span>,</span>
|
||||
<span id="cb16-6"><a href="#cb16-6" aria-hidden="true" tabindex="-1"></a> <span class="st">"The capital of France is"</span>,</span>
|
||||
<span id="cb16-7"><a href="#cb16-7" aria-hidden="true" tabindex="-1"></a> <span class="st">"The future of AI is"</span>,</span>
|
||||
<span id="cb16-8"><a href="#cb16-8" aria-hidden="true" tabindex="-1"></a>]</span>
|
||||
<span id="cb16-9"><a href="#cb16-9" aria-hidden="true" tabindex="-1"></a>sampling_params <span class="op">=</span> SamplingParams(temperature<span class="op">=</span><span class="fl">0.8</span>, top_p<span class="op">=</span><span class="fl">0.95</span>)</span>
|
||||
<span id="cb16-10"><a href="#cb16-10" aria-hidden="true" tabindex="-1"></a>llm <span class="op">=</span> LLM(<span class="st">"path/to/your/sparse/model"</span>)</span>
|
||||
<span id="cb16-11"><a href="#cb16-11" aria-hidden="true" tabindex="-1"></a>outputs <span class="op">=</span> llm.generate(prompts, sampling_params)</span>
|
||||
<span id="cb16-12"><a href="#cb16-12" aria-hidden="true" tabindex="-1"></a></span>
|
||||
<span id="cb16-13"><a href="#cb16-13" aria-hidden="true" tabindex="-1"></a><span class="cf">for</span> output <span class="kw">in</span> outputs:</span>
|
||||
<span id="cb16-14"><a href="#cb16-14" aria-hidden="true" tabindex="-1"></a> prompt <span class="op">=</span> output.prompt</span>
|
||||
<span id="cb16-15"><a href="#cb16-15" aria-hidden="true" tabindex="-1"></a> generated_text <span class="op">=</span> output.outputs[<span class="dv">0</span>].text</span>
|
||||
<span id="cb16-16"><a href="#cb16-16" aria-hidden="true" tabindex="-1"></a> <span class="bu">print</span>(<span class="ss">f"Prompt: </span><span class="sc">{</span>prompt<span class="sc">!r}</span><span class="ss">, Generated text: </span><span class="sc">{</span>generated_text<span class="sc">!r}</span><span class="ss">"</span>)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
|
||||
<div class="sourceCode" id="cb17"><pre class="sourceCode python code-with-copy"><code class="sourceCode python"><span id="cb17-1"><a href="#cb17-1" aria-hidden="true" tabindex="-1"></a><span class="im">from</span> vllm <span class="im">import</span> LLM, SamplingParams</span>
|
||||
<span id="cb17-2"><a href="#cb17-2" aria-hidden="true" tabindex="-1"></a></span>
|
||||
<span id="cb17-3"><a href="#cb17-3" aria-hidden="true" tabindex="-1"></a>prompts <span class="op">=</span> [</span>
|
||||
<span id="cb17-4"><a href="#cb17-4" aria-hidden="true" tabindex="-1"></a> <span class="st">"Hello, my name is"</span>,</span>
|
||||
<span id="cb17-5"><a href="#cb17-5" aria-hidden="true" tabindex="-1"></a> <span class="st">"The president of the United States is"</span>,</span>
|
||||
<span id="cb17-6"><a href="#cb17-6" aria-hidden="true" tabindex="-1"></a> <span class="st">"The capital of France is"</span>,</span>
|
||||
<span id="cb17-7"><a href="#cb17-7" aria-hidden="true" tabindex="-1"></a> <span class="st">"The future of AI is"</span>,</span>
|
||||
<span id="cb17-8"><a href="#cb17-8" aria-hidden="true" tabindex="-1"></a>]</span>
|
||||
<span id="cb17-9"><a href="#cb17-9" aria-hidden="true" tabindex="-1"></a>sampling_params <span class="op">=</span> SamplingParams(temperature<span class="op">=</span><span class="fl">0.8</span>, top_p<span class="op">=</span><span class="fl">0.95</span>)</span>
|
||||
<span id="cb17-10"><a href="#cb17-10" aria-hidden="true" tabindex="-1"></a>llm <span class="op">=</span> LLM(<span class="st">"path/to/your/sparse/model"</span>)</span>
<span id="cb17-11"><a href="#cb17-11" aria-hidden="true" tabindex="-1"></a>outputs <span class="op">=</span> llm.generate(prompts, sampling_params)</span>
<span id="cb17-12"><a href="#cb17-12" aria-hidden="true" tabindex="-1"></a></span>
<span id="cb17-13"><a href="#cb17-13" aria-hidden="true" tabindex="-1"></a><span class="cf">for</span> output <span class="kw">in</span> outputs:</span>
<span id="cb17-14"><a href="#cb17-14" aria-hidden="true" tabindex="-1"></a> prompt <span class="op">=</span> output.prompt</span>
<span id="cb17-15"><a href="#cb17-15" aria-hidden="true" tabindex="-1"></a> generated_text <span class="op">=</span> output.outputs[<span class="dv">0</span>].text</span>
<span id="cb17-16"><a href="#cb17-16" aria-hidden="true" tabindex="-1"></a> <span class="bu">print</span>(<span class="ss">f"Prompt: </span><span class="sc">{</span>prompt<span class="sc">!r}</span><span class="ss">, Generated text: </span><span class="sc">{</span>generated_text<span class="sc">!r}</span><span class="ss">"</span>)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>For more details on vLLM’s capabilities and advanced configuration options, see the <a href="https://docs.vllm.ai/">official vLLM documentation</a>.</p>
</section>
<section id="learn-more" class="level3">
@@ -901,10 +905,10 @@ Warning
</div>
<div class="callout-body-container callout-body">
<p>If your integration could not be loaded, please ensure you have pip-installed it in editable mode.</p>
<div class="sourceCode" id="cb17"><pre class="sourceCode bash code-with-copy"><code class="sourceCode bash"><span id="cb17-1"><a href="#cb17-1" aria-hidden="true" tabindex="-1"></a><span class="ex">pip</span> install <span class="at">-e</span> .</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb18"><pre class="sourceCode bash code-with-copy"><code class="sourceCode bash"><span id="cb18-1"><a href="#cb18-1" aria-hidden="true" tabindex="-1"></a><span class="ex">pip</span> install <span class="at">-e</span> .</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>and correctly spelled the integration name in the config file.</p>
<div class="sourceCode" id="cb18"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb18-1"><a href="#cb18-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb18-2"><a href="#cb18-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.your_integration_name.YourIntegrationPlugin</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<div class="sourceCode" id="cb19"><pre class="sourceCode yaml code-with-copy"><code class="sourceCode yaml"><span id="cb19-1"><a href="#cb19-1" aria-hidden="true" tabindex="-1"></a><span class="fu">plugins</span><span class="kw">:</span></span>
<span id="cb19-2"><a href="#cb19-2" aria-hidden="true" tabindex="-1"></a><span class="at"> </span><span class="kw">-</span><span class="at"> axolotl.integrations.your_integration_name.YourIntegrationPlugin</span></span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
</div>
</div>
<div class="callout callout-style-default callout-note callout-titled">
@@ -984,7 +984,7 @@ Tip
</div>
</div>
<div class="callout-body-container callout-body">
<p>Check out our <a href="https://github.com/axolotl-ai-cloud/axolotl-cookbook/tree/main/grpo#training-an-r1-style-large-language-model-using-grpo">GRPO cookbook</a>.</p>
<p>Check out our <a href="https://github.com/axolotl-ai-cloud/grpo_code">GRPO cookbook</a>.</p>
</div>
</div>
<p>In the latest GRPO implementation, <code>vLLM</code> is used to significantly speed up trajectory generation during training. In this example, we’re using 4 GPUs - 2 for training, and 2 for vLLM:</p>
@@ -455,7 +455,7 @@
"href": "docs/custom_integrations.html#cut-cross-entropy",
"title": "Custom Integrations",
"section": "Cut Cross Entropy",
"text": "Cut Cross Entropy\nCut Cross Entropy (CCE) reduces VRAM usage through optimization on the cross-entropy operation during loss calculation.\nSee https://github.com/apple/ml-cross-entropy\n\nRequirements\n\nPyTorch 2.4.0 or higher\n\n\n\nInstallation\nRun the following command to install cut_cross_entropy[transformers] if you don’t have it already.\n\nIf you are in dev environment\n\npython scripts/cutcrossentropy_install.py | sh\n\nIf you are installing from pip\n\npip3 uninstall -y cut-cross-entropy && pip3 install \"cut-cross-entropy[transformers] @ git+https://github.com/apple/ml-cross-entropy.git@bad6f7b49c75fdec69471abb71b4cddd0f0c6438\"\n\n\nUsage\nplugins:\n - axolotl.integrations.cut_cross_entropy.CutCrossEntropyPlugin\n\n\nSupported Models\n\nllama\nllama4\nllama4_text\nmllama\nphi3\ngemma\ngemma2\ngemma3\ngemma3_text\nmistral\nmistral3\nqwen2\nqwen2_moe\nqwen2_vl\nqwen2_5_vl\nqwen3\nqwen3_moe\ncohere\ncohere2\nglm\nglm4\n\n\n\nCitation\n@article{wijmans2024cut,\n author = {Erik Wijmans and\n Brody Huval and\n Alexander Hertzberg and\n Vladlen Koltun and\n Philipp Kr\\\"ahenb\\\"uhl},\n title = {Cut Your Losses in Large-Vocabulary Language Models},\n journal = {arXiv},\n year = {2024},\n url = {https://arxiv.org/abs/2411.09009},\n}\nPlease see reference here",
"text": "Cut Cross Entropy\nCut Cross Entropy (CCE) reduces VRAM usage through optimization on the cross-entropy operation during loss calculation.\nSee https://github.com/apple/ml-cross-entropy\n\nRequirements\n\nPyTorch 2.4.0 or higher\n\n\n\nInstallation\nRun the following command to install cut_cross_entropy[transformers] if you don’t have it already.\n\nIf you are in dev environment\n\npython scripts/cutcrossentropy_install.py | sh\n\nIf you are installing from pip\n\npip3 uninstall -y cut-cross-entropy && pip3 install \"cut-cross-entropy[transformers] @ git+https://github.com/apple/ml-cross-entropy.git@bad6f7b49c75fdec69471abb71b4cddd0f0c6438\"\n\n\nUsage\nNOTE: If you are training a VLM model, please use older version of Axolotl as upstream has applied a major VLM refactor, and our patches have not been updated yet.\ngit checkout 787880215b3ab32ccaf81c1b2e9588c6f3e6e764\n\npip3 install --no-build-isolation -e .\nplugins:\n - axolotl.integrations.cut_cross_entropy.CutCrossEntropyPlugin\n\n\nSupported Models\n\nllama\nllama4\nllama4_text\nmllama\nphi3\ngemma\ngemma2\ngemma3\ngemma3_text\nmistral\nmistral3\nqwen2\nqwen2_moe\nqwen2_vl\nqwen2_5_vl\nqwen3\nqwen3_moe\ncohere\ncohere2\nglm\nglm4\n\n\n\nCitation\n@article{wijmans2024cut,\n author = {Erik Wijmans and\n Brody Huval and\n Alexander Hertzberg and\n Vladlen Koltun and\n Philipp Kr\\\"ahenb\\\"uhl},\n title = {Cut Your Losses in Large-Vocabulary Language Models},\n journal = {arXiv},\n year = {2024},\n url = {https://arxiv.org/abs/2411.09009},\n}\nPlease see reference here",
"crumbs": [
"Advanced Features",
"Custom Integrations"
@@ -1407,7 +1407,7 @@
"href": "docs/api/monkeypatch.lora_kernels.html",
"title": "monkeypatch.lora_kernels",
"section": "",
"text": "monkeypatch.lora_kernels\nModule for patching custom LoRA Triton kernels and torch.autograd functions.\n\n\n\n\n\nName\nDescription\n\n\n\n\nFakeMLP\nplaceholder MLP for triton patching\n\n\n\n\n\nmonkeypatch.lora_kernels.FakeMLP(gate_proj, up_proj, down_proj)\nplaceholder MLP for triton patching\n\n\n\n\n\n\n\nName\nDescription\n\n\n\n\napply_lora_kernel_patches\nApplies optimized Triton kernel patches to a PEFT model.\n\n\nget_attention_cls_from_config\nGet the appropriate attention class by inspecting the model config.\n\n\noriginal_apply_o\nOriginal implementation of output projection without optimizations.\n\n\noriginal_apply_qkv\nOriginal implementation of QKV projection without optimizations.\n\n\npatch_self_attn_lora\nGiven an axolotl config, this method patches the inferred attention class forward\n\n\n\n\n\nmonkeypatch.lora_kernels.apply_lora_kernel_patches(model, cfg)\nApplies optimized Triton kernel patches to a PEFT model.\nPatches a PEFT model with optimized implementations for MLP and attention\ncomputations. The optimizations include custom Triton kernels for activation\nfunctions and specialized autograd functions for LoRA computations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodel\nPeftModelForCausalLM\nA PEFT model to be patched with optimized kernels.\nrequired\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nPeftModelForCausalLM\nPeftModelForCausalLM\nThe patched model with optimized kernels.\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nTypeError\nIf the provided model is not a PeftModelForCausalLM.\n\n\n\nNotImplementedError\nIf the model type is not supported.\n\n\n\nAssertionError\nIf multiple adapters are active (currently unsupported).\n\n\n\n\n\n\nThe optimizations require LoRA adapters with no dropout and no bias terms. 
The\nfunction will skip patching if these conditions aren’t met.\n\n\n\n\nmonkeypatch.lora_kernels.get_attention_cls_from_config(cfg)\nGet the appropriate attention class by inspecting the model config.\nUses dynamic import to support any model architecture that follows\nthe standard transformers naming convention.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nType[nn.Module]\nThe appropriate attention class for the model.\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nValueError\nIf base_model not specified or attention class cannot be imported\n\n\n\nImportError\nIf the model module or attention class doesn’t exist\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.original_apply_o(self, hidden_states)\nOriginal implementation of output projection without optimizations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nself\nnn.Module\nThe attention module instance.\nrequired\n\n\nhidden_states\ntorch.Tensor\nInput tensor of shape [batch_size, seq_len, hidden_dim]`.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntorch.Tensor\nThe output projection result.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.original_apply_qkv(self, hidden_states)\nOriginal implementation of QKV projection without optimizations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nself\nnn.Module\nThe attention module instance.\nrequired\n\n\nhidden_states\ntorch.Tensor\nInput tensor of shape [batch_size, seq_len, hidden_dim].\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[torch.Tensor, torch.Tensor, torch.Tensor]\nA tuple (query_states, key_states, value_states) containing the projected states for query, key, and value.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.patch_self_attn_lora(cfg)\nGiven an axolotl config, this method patches the inferred attention 
class forward\npass with optimized LoRA implementations.\nIt modifies the attention class to use optimized QKV and output projections. The\noriginal implementation is preserved and can be restored if needed.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nAssertionError\nIf the required code blocks are not found in the attention implementation."
"text": "monkeypatch.lora_kernels\nModule for patching custom LoRA Triton kernels and torch.autograd functions.\n\n\n\n\n\nName\nDescription\n\n\n\n\nFakeMLP\nplaceholder MLP for triton patching\n\n\n\n\n\nmonkeypatch.lora_kernels.FakeMLP(gate_proj, up_proj, down_proj)\nplaceholder MLP for triton patching\n\n\n\n\n\n\n\nName\nDescription\n\n\n\n\napply_lora_kernel_patches\nApplies optimized Triton kernel patches to a PEFT model.\n\n\nget_attention_cls_from_config\nGet the appropriate attention class by inspecting the model config.\n\n\nget_layers\nGet the layers of the model. Handles text-only and multimodal models.\n\n\noriginal_apply_o\nOriginal implementation of output projection without optimizations.\n\n\noriginal_apply_qkv\nOriginal implementation of QKV projection without optimizations.\n\n\npatch_self_attn_lora\nGiven an axolotl config, this method patches the inferred attention class forward\n\n\n\n\n\nmonkeypatch.lora_kernels.apply_lora_kernel_patches(model, cfg)\nApplies optimized Triton kernel patches to a PEFT model.\nPatches a PEFT model with optimized implementations for MLP and attention\ncomputations. 
The optimizations include custom Triton kernels for activation\nfunctions and specialized autograd functions for LoRA computations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodel\nPeftModelForCausalLM\nA PEFT model to be patched with optimized kernels.\nrequired\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nPeftModelForCausalLM\nPeftModelForCausalLM\nThe patched model with optimized kernels.\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nTypeError\nIf the provided model is not a PeftModelForCausalLM.\n\n\n\nNotImplementedError\nIf the model type is not supported.\n\n\n\nAssertionError\nIf multiple adapters are active (currently unsupported).\n\n\n\n\n\n\nThe optimizations require LoRA adapters with no dropout and no bias terms. The\nfunction will skip patching if these conditions aren’t met.\n\n\n\n\nmonkeypatch.lora_kernels.get_attention_cls_from_config(cfg)\nGet the appropriate attention class by inspecting the model config.\nUses dynamic import to support any model architecture that follows\nthe standard transformers naming convention.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nType[nn.Module]\nThe appropriate attention class for the model.\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nValueError\nIf base_model not specified or attention class cannot be imported\n\n\n\nImportError\nIf the model module or attention class doesn’t exist\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.get_layers(model)\nGet the layers of the model. 
Handles text-only and multimodal models.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodel\nPeftModelForCausalLM\nA PEFT model.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[nn.Module]\nA list of layers.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.original_apply_o(self, hidden_states)\nOriginal implementation of output projection without optimizations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nself\nnn.Module\nThe attention module instance.\nrequired\n\n\nhidden_states\ntorch.Tensor\nInput tensor of shape [batch_size, seq_len, hidden_dim]`.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntorch.Tensor\nThe output projection result.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.original_apply_qkv(self, hidden_states)\nOriginal implementation of QKV projection without optimizations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nself\nnn.Module\nThe attention module instance.\nrequired\n\n\nhidden_states\ntorch.Tensor\nInput tensor of shape [batch_size, seq_len, hidden_dim].\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[torch.Tensor, torch.Tensor, torch.Tensor]\nA tuple (query_states, key_states, value_states) containing the projected states for query, key, and value.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.patch_self_attn_lora(cfg)\nGiven an axolotl config, this method patches the inferred attention class forward\npass with optimized LoRA implementations.\nIt modifies the attention class to use optimized QKV and output projections. The\noriginal implementation is preserved and can be restored if needed.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nAssertionError\nIf the required code blocks are not found in the attention implementation."
},
{
"objectID": "docs/api/monkeypatch.lora_kernels.html#classes",
@@ -1421,7 +1421,7 @@
"href": "docs/api/monkeypatch.lora_kernels.html#functions",
"title": "monkeypatch.lora_kernels",
"section": "",
"text": "Name\nDescription\n\n\n\n\napply_lora_kernel_patches\nApplies optimized Triton kernel patches to a PEFT model.\n\n\nget_attention_cls_from_config\nGet the appropriate attention class by inspecting the model config.\n\n\noriginal_apply_o\nOriginal implementation of output projection without optimizations.\n\n\noriginal_apply_qkv\nOriginal implementation of QKV projection without optimizations.\n\n\npatch_self_attn_lora\nGiven an axolotl config, this method patches the inferred attention class forward\n\n\n\n\n\nmonkeypatch.lora_kernels.apply_lora_kernel_patches(model, cfg)\nApplies optimized Triton kernel patches to a PEFT model.\nPatches a PEFT model with optimized implementations for MLP and attention\ncomputations. The optimizations include custom Triton kernels for activation\nfunctions and specialized autograd functions for LoRA computations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodel\nPeftModelForCausalLM\nA PEFT model to be patched with optimized kernels.\nrequired\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nPeftModelForCausalLM\nPeftModelForCausalLM\nThe patched model with optimized kernels.\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nTypeError\nIf the provided model is not a PeftModelForCausalLM.\n\n\n\nNotImplementedError\nIf the model type is not supported.\n\n\n\nAssertionError\nIf multiple adapters are active (currently unsupported).\n\n\n\n\n\n\nThe optimizations require LoRA adapters with no dropout and no bias terms. 
The\nfunction will skip patching if these conditions aren’t met.\n\n\n\n\nmonkeypatch.lora_kernels.get_attention_cls_from_config(cfg)\nGet the appropriate attention class by inspecting the model config.\nUses dynamic import to support any model architecture that follows\nthe standard transformers naming convention.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nType[nn.Module]\nThe appropriate attention class for the model.\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nValueError\nIf base_model not specified or attention class cannot be imported\n\n\n\nImportError\nIf the model module or attention class doesn’t exist\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.original_apply_o(self, hidden_states)\nOriginal implementation of output projection without optimizations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nself\nnn.Module\nThe attention module instance.\nrequired\n\n\nhidden_states\ntorch.Tensor\nInput tensor of shape [batch_size, seq_len, hidden_dim]`.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntorch.Tensor\nThe output projection result.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.original_apply_qkv(self, hidden_states)\nOriginal implementation of QKV projection without optimizations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nself\nnn.Module\nThe attention module instance.\nrequired\n\n\nhidden_states\ntorch.Tensor\nInput tensor of shape [batch_size, seq_len, hidden_dim].\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[torch.Tensor, torch.Tensor, torch.Tensor]\nA tuple (query_states, key_states, value_states) containing the projected states for query, key, and value.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.patch_self_attn_lora(cfg)\nGiven an axolotl config, this method patches the inferred attention 
class forward\npass with optimized LoRA implementations.\nIt modifies the attention class to use optimized QKV and output projections. The\noriginal implementation is preserved and can be restored if needed.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nAssertionError\nIf the required code blocks are not found in the attention implementation."
"text": "Name\nDescription\n\n\n\n\napply_lora_kernel_patches\nApplies optimized Triton kernel patches to a PEFT model.\n\n\nget_attention_cls_from_config\nGet the appropriate attention class by inspecting the model config.\n\n\nget_layers\nGet the layers of the model. Handles text-only and multimodal models.\n\n\noriginal_apply_o\nOriginal implementation of output projection without optimizations.\n\n\noriginal_apply_qkv\nOriginal implementation of QKV projection without optimizations.\n\n\npatch_self_attn_lora\nGiven an axolotl config, this method patches the inferred attention class forward\n\n\n\n\n\nmonkeypatch.lora_kernels.apply_lora_kernel_patches(model, cfg)\nApplies optimized Triton kernel patches to a PEFT model.\nPatches a PEFT model with optimized implementations for MLP and attention\ncomputations. The optimizations include custom Triton kernels for activation\nfunctions and specialized autograd functions for LoRA computations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodel\nPeftModelForCausalLM\nA PEFT model to be patched with optimized kernels.\nrequired\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nPeftModelForCausalLM\nPeftModelForCausalLM\nThe patched model with optimized kernels.\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nTypeError\nIf the provided model is not a PeftModelForCausalLM.\n\n\n\nNotImplementedError\nIf the model type is not supported.\n\n\n\nAssertionError\nIf multiple adapters are active (currently unsupported).\n\n\n\n\n\n\nThe optimizations require LoRA adapters with no dropout and no bias terms. 
The\nfunction will skip patching if these conditions aren’t met.\n\n\n\n\nmonkeypatch.lora_kernels.get_attention_cls_from_config(cfg)\nGet the appropriate attention class by inspecting the model config.\nUses dynamic import to support any model architecture that follows\nthe standard transformers naming convention.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nType[nn.Module]\nThe appropriate attention class for the model.\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nValueError\nIf base_model not specified or attention class cannot be imported\n\n\n\nImportError\nIf the model module or attention class doesn’t exist\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.get_layers(model)\nGet the layers of the model. Handles text-only and multimodal models.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodel\nPeftModelForCausalLM\nA PEFT model.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[nn.Module]\nA list of layers.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.original_apply_o(self, hidden_states)\nOriginal implementation of output projection without optimizations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nself\nnn.Module\nThe attention module instance.\nrequired\n\n\nhidden_states\ntorch.Tensor\nInput tensor of shape [batch_size, seq_len, hidden_dim]`.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntorch.Tensor\nThe output projection result.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.original_apply_qkv(self, hidden_states)\nOriginal implementation of QKV projection without optimizations.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nself\nnn.Module\nThe attention module instance.\nrequired\n\n\nhidden_states\ntorch.Tensor\nInput tensor of shape [batch_size, seq_len, 
hidden_dim].\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[torch.Tensor, torch.Tensor, torch.Tensor]\nA tuple (query_states, key_states, value_states) containing the projected states for query, key, and value.\n\n\n\n\n\n\n\nmonkeypatch.lora_kernels.patch_self_attn_lora(cfg)\nGiven an axolotl config, this method patches the inferred attention class forward\npass with optimized LoRA implementations.\nIt modifies the attention class to use optimized QKV and output projections. The\noriginal implementation is preserved and can be restored if needed.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nDictionary mapping axolotl config keys to values.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nAssertionError\nIf the required code blocks are not found in the attention implementation."
},
{
"objectID": "docs/api/integrations.kd.trainer.html",
378
sitemap.xml
@@ -2,758 +2,758 @@
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://docs.axolotl.ai/docs/unsloth.html</loc>
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/conversation.html</loc>
<lastmod>2025-06-13T14:00:46.992Z</lastmod>
<lastmod>2025-06-14T18:54:17.411Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/stepwise_supervised.html</loc>
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/tokenized.html</loc>
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/mac.html</loc>
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/nccl.html</loc>
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/multi-node.html</loc>
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/docker.html</loc>
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/lr_groups.html</loc>
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/inference.html</loc>
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/cli.html</loc>
<lastmod>2025-06-13T14:00:46.992Z</lastmod>
<lastmod>2025-06-14T18:54:17.411Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/faq.html</loc>
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/getting-started.html</loc>
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/custom_integrations.html</loc>
<lastmod>2025-06-13T14:00:46.992Z</lastmod>
<lastmod>2025-06-14T18:54:17.411Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/fsdp_qlora.html</loc>
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/common.const.html</loc>
<lastmod>2025-06-13T14:01:18.851Z</lastmod>
<lastmod>2025-06-14T18:54:47.201Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_tokenizers.html</loc>
<lastmod>2025-06-13T14:01:17.559Z</lastmod>
<lastmod>2025-06-14T18:54:45.898Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.user_defined.html</loc>
<lastmod>2025-06-13T14:01:18.194Z</lastmod>
<lastmod>2025-06-14T18:54:46.535Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.training_args.html</loc>
<lastmod>2025-06-13T14:01:17.677Z</lastmod>
<lastmod>2025-06-14T18:54:46.015Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.user_defined.html</loc>
<lastmod>2025-06-13T14:01:18.119Z</lastmod>
<lastmod>2025-06-14T18:54:46.460Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.dict.html</loc>
<lastmod>2025-06-13T14:01:18.585Z</lastmod>
<lastmod>2025-06-14T18:54:46.933Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.unsloth_.html</loc>
<lastmod>2025-06-13T14:01:18.452Z</lastmod>
<lastmod>2025-06-14T18:54:46.799Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.collators.mamba.html</loc>
<lastmod>2025-06-13T14:01:18.890Z</lastmod>
<lastmod>2025-06-14T18:54:47.240Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.mixins.optimizer.html</loc>
<lastmod>2025-06-13T14:01:18.017Z</lastmod>
<lastmod>2025-06-14T18:54:46.358Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.train.html</loc>
<lastmod>2025-06-13T14:01:17.757Z</lastmod>
<lastmod>2025-06-14T18:54:46.096Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.messages.chat.html</loc>
<lastmod>2025-06-13T14:01:18.169Z</lastmod>
<lastmod>2025-06-14T18:54:46.510Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.chat.format.llama3x.html</loc>
<lastmod>2025-06-13T14:01:17.703Z</lastmod>
<lastmod>2025-06-14T18:54:46.041Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/loaders.processor.html</loc>
<lastmod>2025-06-13T14:01:17.996Z</lastmod>
<lastmod>2025-06-14T18:54:46.337Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.datasets.transforms.chat_builder.html</loc>
<lastmod>2025-06-13T14:01:17.717Z</lastmod>
<lastmod>2025-06-14T18:54:46.056Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.mamba.html</loc>
<lastmod>2025-06-13T14:01:17.942Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.282Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.gradient_checkpointing.offload_cpu.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.458Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.805Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/models.mamba.modeling_mamba.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.867Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.217Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.relora.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.946Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.286Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.builders.causal.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.580Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.919Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.chat.messages.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.700Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.038Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/integrations.lm_eval.args.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.845Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.194Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.quantize.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.911Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.250Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.checks.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.792Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.131Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.kto.llama3.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.203Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.544Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/kernels.lora.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.320Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.661Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.multimodal.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.679Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.028Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/loaders.adapter.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.002Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.342Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/index.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.421Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.760Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.llama_patch_multipack.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.435Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.782Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/train.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.482Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.821Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.mixtral.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.455Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.802Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.chatml.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.191Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.532Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/integrations.grokfast.optimizer.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.830Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.180Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.samplers.multipack.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.935Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.285Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_chat.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.098Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.438Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.llama_expand_mask.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.400Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.743Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/common.architectures.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.849Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.199Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.utils.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.707Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.056Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.chat_templates.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.500Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.848Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.comet_.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.953Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.303Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.main.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.748Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.088Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.grpo.trainer.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.964Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.303Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.mlflow_.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.950Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.300Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/loaders.model.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.987Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.327Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.tokenization.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.491Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.838Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.kto.chatml.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.211Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.552Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.model_shard_quant.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.511Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.858Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.mixins.scheduler.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.028Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.368Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.chat.format.chatml.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.701Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.040Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.passthrough.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.195Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.536Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.orpo.chat_template.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.233Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.574Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.multipack.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.392Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.734Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.base.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.052Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.392Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.grpo.sampler.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.976Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.316Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.collators.batching.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.887Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.237Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.lora_kernels.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.425Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.771Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/integrations.kd.trainer.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.838Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.188Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.enums.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.702Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.050Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/datasets.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.504Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.842Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.zephyr.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.192Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.533Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.data.batch_dataset_fetcher.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.453Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.800Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.model.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.639Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.988Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/integrations.cut_cross_entropy.args.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.829Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.179Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.trainer.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.539Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.886Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.lisa.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.946Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.296Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.data.pretraining.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.594Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.942Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.profiler.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.945Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.295Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.metharme.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.154Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.495Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.collators.core.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.868Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.218Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.stablelm_attn_hijack_flash.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.441Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.788Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_w_system.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.111Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.452Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.lora.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.505Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.852Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/qat.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/quantize.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/ray-integration.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/sequence_parallelism.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/reward_modelling.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/index.html</loc>
|
||||
<lastmod>2025-06-13T14:00:47.009Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.428Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/src/axolotl/integrations/LICENSE.html</loc>
|
||||
<lastmod>2025-06-13T14:00:47.013Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.432Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/FAQS.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.990Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.410Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/src/axolotl/integrations/cut_cross_entropy/ACKNOWLEDGEMENTS.html</loc>
|
||||
<lastmod>2025-06-13T14:00:47.013Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.433Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/TODO.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.991Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.410Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/examples/colab-notebooks/colab-axolotl-example.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.997Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.416Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/torchao.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/config.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.992Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.411Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/input_output.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/batch_vs_grad.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.992Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.411Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.quantization.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.621Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.970Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.bench.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.514Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.862Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/loaders.tokenizer.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.995Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.335Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.freeze.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.522Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.869Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.orcamini.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.158Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.499Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.training.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.645Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.993Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/integrations.spectrum.args.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.848Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.198Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.ctx_managers.sequence_parallel.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.050Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.391Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.inference.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.823Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.163Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/logging_config.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.569Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.908Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/loaders.constants.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.011Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.352Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.chat_template.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.170Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.511Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.args.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.784Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.124Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.trl.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.674Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.023Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.pygmalion.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.165Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.506Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/convert.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.517Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.856Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.base.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.921Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.260Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.preprocess.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.852Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.191Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.config.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.809Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.148Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.relora.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.399Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.741Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.chat.format.shared.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.704Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.043Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.dpo.trainer.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.953Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.293Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.qat.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.960Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.310Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.optimizers.adopt.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.593Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.941Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.evaluate.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.766Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.105Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.trl.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.937Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.277Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.builders.base.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.575Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.914Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.trainer_fsdp_optim.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.444Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.791Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.gradient_checkpointing.offload_disk.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.484Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.831Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.distributed.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.582Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.930Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.input_output.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.143Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.484Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.config.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.633Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.981Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.utils.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.889Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.229Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.perplexity.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.941Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.291Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.integrations.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.691Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.040Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/loaders.patch_manager.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.010Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.350Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.utils.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.433Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.779Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.vllm_serve.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.896Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.236Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.mistral_attn_hijack_flash.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.391Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.733Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/integrations.liger.args.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.841Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.191Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_instruct.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.099Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.440Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.data.sft.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.601Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.949Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.collators.mm_chat.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.895Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.245Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.llama2_chat.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.131Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.472Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.transformers_fa_utils.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.451Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.797Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.merge_sharded_fsdp_weights.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.843Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.183Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.merge_lora.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.831Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.171Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/integrations.base.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.826Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.175Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.mixins.rng_state_loader.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.020Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.361Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.llama_attn_hijack_flash.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.375Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.717Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/kernels.quantize.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.348Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.690Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/evaluate.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.493Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.831Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.builders.rl.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.587Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:45.927Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.datasets.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.662Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.011Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/common.datasets.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.866Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.216Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/kernels.utils.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.349Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.691Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.completion.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.137Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.478Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.bradley_terry.llama3.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.237Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.578Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.stepwise_supervised.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.148Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.488Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/kernels.swiglu.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.340Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.682Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.cloud.base.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.900Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.239Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.kto.user_defined.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.212Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.554Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.chat_template.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.084Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.425Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.btlm_attn_hijack_flash.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.434Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.781Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.peft.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.671Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:47.019Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.datasets.chat.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.709Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.048Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/core.trainers.utils.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.977Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.317Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/kernels.geglu.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.330Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.672Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.cloud.modal_.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.906Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.245Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.llama_attn_hijack_xformers.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.376Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.718Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/utils.schedulers.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.563Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.910Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.llama3.html</loc>
|
||||
<lastmod>2025-06-13T14:01:18.180Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.522Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/api/cli.sweeps.html</loc>
|
||||
<lastmod>2025-06-13T14:01:17.858Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:46.197Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/multimodal.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/debugging.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/multi-gpu.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/lora_optims.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/rlhf.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/amd_hpc.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.992Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.411Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/installation.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/multipack.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.996Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.415Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/dataset_preprocessing.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/dataset_loading.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/dataset-formats/inst_tune.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/dataset-formats/template_free.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/dataset-formats/index.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.992Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
|
||||
</url>
|
||||
<url>
|
||||
<loc>https://docs.axolotl.ai/docs/dataset-formats/pretraining.html</loc>
|
||||
<lastmod>2025-06-13T14:00:46.993Z</lastmod>
|
||||
<lastmod>2025-06-14T18:54:17.412Z</lastmod>
|
||||
</url>
|
||||
</urlset>