Built site for gh-pages

This commit is contained in:
Quarto GHA Workflow Runner
2025-06-25 12:39:05 +00:00
parent 26c3e80b1f
commit 7e84479334
4 changed files with 200 additions and 193 deletions


@@ -1 +1 @@
0496a1b7
d5d7dce8


@@ -685,7 +685,8 @@ from the full gradient tensor.</p>
<span id="cb4-4"><a href="#cb4-4" aria-hidden="true" tabindex="-1"></a> gradient_accumulation_steps,</span>
<span id="cb4-5"><a href="#cb4-5" aria-hidden="true" tabindex="-1"></a> ring_attn_func,</span>
<span id="cb4-6"><a href="#cb4-6" aria-hidden="true" tabindex="-1"></a> heads_k_stride,</span>
<span id="cb4-7"><a href="#cb4-7" aria-hidden="true" tabindex="-1"></a>)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<span id="cb4-7"><a href="#cb4-7" aria-hidden="true" tabindex="-1"></a> gather_outputs,</span>
<span id="cb4-8"><a href="#cb4-8" aria-hidden="true" tabindex="-1"></a>)</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
<p>Context manager for sequence parallelism operations.</p>
<p>This class provides a context that will automatically apply sequence parallelism
during model forward passes using a pre-forward hook, and gather outputs from
@@ -738,6 +739,12 @@ across the sequence parallelism group using a post-forward hook.</p>
<td>Sequence parallelism K head stride size. Passed through to <code>varlen_llama3</code> <code>ring_flash_attn</code> implementation.</td>
<td><em>required</em></td>
</tr>
<tr class="even">
<td>gather_outputs</td>
<td>bool</td>
<td>Whether to gather outputs after model forward pass across the sequence parallel group.</td>
<td><em>required</em></td>
</tr>
</tbody>
</table>
</section>
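The module documented above slices each batch along the sequence dimension across the sequence-parallel group, then gathers outputs back after the forward pass. A minimal single-process sketch of the slicing step (illustrative only: the real `apply_sequence_parallelism` operates on `torch.Tensor` batches within a `torch.distributed` process group, and the helper name here is hypothetical) shows how padding makes the sequence length divide evenly and how each rank's slice is selected:

```python
def slice_for_rank(seq, local_rank, local_world_size, pad_value=0):
    """Pad `seq` to a multiple of local_world_size and return this rank's
    contiguous slice, plus the original length and padding token count
    (mirroring the documented tuple return of apply_sequence_parallelism)."""
    original_len = len(seq)
    # Right-pad so the sequence splits evenly across the group.
    pad_len = (-original_len) % local_world_size
    padded = list(seq) + [pad_value] * pad_len
    chunk = len(padded) // local_world_size
    start = local_rank * chunk
    return padded[start:start + chunk], original_len, pad_len

# Example: a 10-token sequence over a group of 4 pads to 12, 3 tokens per rank.
print(slice_for_rank(list(range(10)), 0, 4))  # -> ([0, 1, 2], 10, 2)
print(slice_for_rank(list(range(10)), 3, 4))  # -> ([9, 0, 0], 10, 2)
```

In the actual context manager this slicing runs in a pre-forward hook on each wrapped model, and (when `gather_outputs` is set) an all-gather along the sequence dimension runs in a post-forward hook to reassemble the full-sequence output.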


@@ -1482,14 +1482,14 @@
"href": "docs/api/utils.ctx_managers.sequence_parallel.html",
"title": "utils.ctx_managers.sequence_parallel",
"section": "",
"text": "utils.ctx_managers.sequence_parallel\nModule for Axolotl trainer sequence parallelism manager and utilities\n\n\n\n\n\nName\nDescription\n\n\n\n\nAllGatherWithGrad\nCustom autograd function for all-gather to preserve gradients.\n\n\nSequenceParallelContextManager\nContext manager for sequence parallelism operations.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad()\nCustom autograd function for all-gather to preserve gradients.\n\n\n\n\n\nName\nDescription\n\n\n\n\nbackward\nBackward pass for all-gather operation.\n\n\nforward\nForward pass of all-gather of data with sequence dimension.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad.backward(\n ctx,\n grad_output,\n)\nBackward pass for all-gather operation.\nExtracts the gradient slice corresponding to this ranks original input\nfrom the full gradient tensor.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nctx\ntorch.autograd.function.FunctionCtx\ntorch.autograd function context.\nrequired\n\n\ngrad_output\ntorch.Tensor\nGradient from subsequent layers with respect to the concatenated output tensor.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[torch.Tensor, None]\nTuple containing the gradient slice for this ranks input tensor and None for the process group parameter which doesnt require gradients.\n\n\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad.forward(\n ctx,\n input_tensor,\n group,\n)\nForward pass of all-gather of data with sequence dimension.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nctx\ntorch.autograd.function.FunctionCtx\ntorch.autograd function context.\nrequired\n\n\ninput_tensor\ntorch.Tensor\nTensor from model output with sequence dimension.\nrequired\n\n\ngroup\ndist.ProcessGroup\ntorch.distributed process group.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntorch.Tensor\nTensor from gathering the input_tensor from across the process 
group and concatenating along the sequence dimension.\n\n\n\n\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.SequenceParallelContextManager(\n models,\n sequence_parallel_degree,\n gradient_accumulation_steps,\n ring_attn_func,\n heads_k_stride,\n)\nContext manager for sequence parallelism operations.\nThis class provides a context that will automatically apply sequence parallelism\nduring model forward passes using a pre-forward hook, and gather outputs from\nacross the sequence parallelism group using a post-forward hook.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodels\nlist[nn.Module]\nList of models to apply sequence parallelism to pre- and post- forward hooks.\nrequired\n\n\nsequence_parallel_degree\nint\nNumber of processes to split sequences over.\nrequired\n\n\ngradient_accumulation_steps\nint\nNumber of steps to accumulate gradients over.\nrequired\n\n\nring_attn_func\nRingAttnFunc\nWhich ring attention function to use. Currently unused.\nrequired\n\n\nheads_k_stride\nint | None\nSequence parallelism K head stride size. 
Passed through to varlen_llama3 ring_flash_attn implementation.\nrequired\n\n\n\n\n\n\n\n\n\n\n\nName\nDescription\n\n\n\n\napply_sequence_parallelism\nApply sequence parallelism slicing to a batch.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.apply_sequence_parallelism(\n batch,\n local_rank,\n local_world_size,\n gradient_accumulation_steps,\n ring_attn_func,\n)\nApply sequence parallelism slicing to a batch.\nSpecial handling is implemented for integer logits_to_keep, which indicates\nto only keep the last N tokens in the sequence during generation.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nbatch\ndict[str, torch.Tensor]\nBatch dictionary (e.g., input_ids, attention_mask, etc.).\nrequired\n\n\nlocal_rank\nint\nLocal rank in the sequence parallel group.\nrequired\n\n\nlocal_world_size\nint\nWorld size of the sequence parallel group.\nrequired\n\n\ngradient_accumulation_steps\nint\nNumber of steps to accumulate gradients over.\nrequired\n\n\nring_attn_func\nRingAttnFunc\nWhich ring attention function to use. Currently unused, but related to above TODO.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[dict[str, torch.Tensor], int, int]\ntuple of: - Batch dictionary with sliced tensors. - The original sequence length before padding. - The number of padding tokens added."
"text": "utils.ctx_managers.sequence_parallel\nModule for Axolotl trainer sequence parallelism manager and utilities\n\n\n\n\n\nName\nDescription\n\n\n\n\nAllGatherWithGrad\nCustom autograd function for all-gather to preserve gradients.\n\n\nSequenceParallelContextManager\nContext manager for sequence parallelism operations.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad()\nCustom autograd function for all-gather to preserve gradients.\n\n\n\n\n\nName\nDescription\n\n\n\n\nbackward\nBackward pass for all-gather operation.\n\n\nforward\nForward pass of all-gather of data with sequence dimension.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad.backward(\n ctx,\n grad_output,\n)\nBackward pass for all-gather operation.\nExtracts the gradient slice corresponding to this ranks original input\nfrom the full gradient tensor.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nctx\ntorch.autograd.function.FunctionCtx\ntorch.autograd function context.\nrequired\n\n\ngrad_output\ntorch.Tensor\nGradient from subsequent layers with respect to the concatenated output tensor.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[torch.Tensor, None]\nTuple containing the gradient slice for this ranks input tensor and None for the process group parameter which doesnt require gradients.\n\n\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad.forward(\n ctx,\n input_tensor,\n group,\n)\nForward pass of all-gather of data with sequence dimension.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nctx\ntorch.autograd.function.FunctionCtx\ntorch.autograd function context.\nrequired\n\n\ninput_tensor\ntorch.Tensor\nTensor from model output with sequence dimension.\nrequired\n\n\ngroup\ndist.ProcessGroup\ntorch.distributed process group.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntorch.Tensor\nTensor from gathering the input_tensor from across the process 
group and concatenating along the sequence dimension.\n\n\n\n\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.SequenceParallelContextManager(\n models,\n sequence_parallel_degree,\n gradient_accumulation_steps,\n ring_attn_func,\n heads_k_stride,\n gather_outputs,\n)\nContext manager for sequence parallelism operations.\nThis class provides a context that will automatically apply sequence parallelism\nduring model forward passes using a pre-forward hook, and gather outputs from\nacross the sequence parallelism group using a post-forward hook.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodels\nlist[nn.Module]\nList of models to apply sequence parallelism to pre- and post- forward hooks.\nrequired\n\n\nsequence_parallel_degree\nint\nNumber of processes to split sequences over.\nrequired\n\n\ngradient_accumulation_steps\nint\nNumber of steps to accumulate gradients over.\nrequired\n\n\nring_attn_func\nRingAttnFunc\nWhich ring attention function to use. Currently unused.\nrequired\n\n\nheads_k_stride\nint | None\nSequence parallelism K head stride size. 
Passed through to varlen_llama3 ring_flash_attn implementation.\nrequired\n\n\ngather_outputs\nbool\nWhether to gather outputs after model forward pass across the sequence parallel group.\nrequired\n\n\n\n\n\n\n\n\n\n\n\nName\nDescription\n\n\n\n\napply_sequence_parallelism\nApply sequence parallelism slicing to a batch.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.apply_sequence_parallelism(\n batch,\n local_rank,\n local_world_size,\n gradient_accumulation_steps,\n ring_attn_func,\n)\nApply sequence parallelism slicing to a batch.\nSpecial handling is implemented for integer logits_to_keep, which indicates\nto only keep the last N tokens in the sequence during generation.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nbatch\ndict[str, torch.Tensor]\nBatch dictionary (e.g., input_ids, attention_mask, etc.).\nrequired\n\n\nlocal_rank\nint\nLocal rank in the sequence parallel group.\nrequired\n\n\nlocal_world_size\nint\nWorld size of the sequence parallel group.\nrequired\n\n\ngradient_accumulation_steps\nint\nNumber of steps to accumulate gradients over.\nrequired\n\n\nring_attn_func\nRingAttnFunc\nWhich ring attention function to use. Currently unused, but related to above TODO.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[dict[str, torch.Tensor], int, int]\ntuple of: - Batch dictionary with sliced tensors. - The original sequence length before padding. - The number of padding tokens added."
},
{
"objectID": "docs/api/utils.ctx_managers.sequence_parallel.html#classes",
"href": "docs/api/utils.ctx_managers.sequence_parallel.html#classes",
"title": "utils.ctx_managers.sequence_parallel",
"section": "",
"text": "Name\nDescription\n\n\n\n\nAllGatherWithGrad\nCustom autograd function for all-gather to preserve gradients.\n\n\nSequenceParallelContextManager\nContext manager for sequence parallelism operations.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad()\nCustom autograd function for all-gather to preserve gradients.\n\n\n\n\n\nName\nDescription\n\n\n\n\nbackward\nBackward pass for all-gather operation.\n\n\nforward\nForward pass of all-gather of data with sequence dimension.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad.backward(\n ctx,\n grad_output,\n)\nBackward pass for all-gather operation.\nExtracts the gradient slice corresponding to this ranks original input\nfrom the full gradient tensor.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nctx\ntorch.autograd.function.FunctionCtx\ntorch.autograd function context.\nrequired\n\n\ngrad_output\ntorch.Tensor\nGradient from subsequent layers with respect to the concatenated output tensor.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[torch.Tensor, None]\nTuple containing the gradient slice for this ranks input tensor and None for the process group parameter which doesnt require gradients.\n\n\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad.forward(\n ctx,\n input_tensor,\n group,\n)\nForward pass of all-gather of data with sequence dimension.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nctx\ntorch.autograd.function.FunctionCtx\ntorch.autograd function context.\nrequired\n\n\ninput_tensor\ntorch.Tensor\nTensor from model output with sequence dimension.\nrequired\n\n\ngroup\ndist.ProcessGroup\ntorch.distributed process group.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntorch.Tensor\nTensor from gathering the input_tensor from across the process group and concatenating along the sequence 
dimension.\n\n\n\n\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.SequenceParallelContextManager(\n models,\n sequence_parallel_degree,\n gradient_accumulation_steps,\n ring_attn_func,\n heads_k_stride,\n)\nContext manager for sequence parallelism operations.\nThis class provides a context that will automatically apply sequence parallelism\nduring model forward passes using a pre-forward hook, and gather outputs from\nacross the sequence parallelism group using a post-forward hook.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodels\nlist[nn.Module]\nList of models to apply sequence parallelism to pre- and post- forward hooks.\nrequired\n\n\nsequence_parallel_degree\nint\nNumber of processes to split sequences over.\nrequired\n\n\ngradient_accumulation_steps\nint\nNumber of steps to accumulate gradients over.\nrequired\n\n\nring_attn_func\nRingAttnFunc\nWhich ring attention function to use. Currently unused.\nrequired\n\n\nheads_k_stride\nint | None\nSequence parallelism K head stride size. Passed through to varlen_llama3 ring_flash_attn implementation.\nrequired"
"text": "Name\nDescription\n\n\n\n\nAllGatherWithGrad\nCustom autograd function for all-gather to preserve gradients.\n\n\nSequenceParallelContextManager\nContext manager for sequence parallelism operations.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad()\nCustom autograd function for all-gather to preserve gradients.\n\n\n\n\n\nName\nDescription\n\n\n\n\nbackward\nBackward pass for all-gather operation.\n\n\nforward\nForward pass of all-gather of data with sequence dimension.\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad.backward(\n ctx,\n grad_output,\n)\nBackward pass for all-gather operation.\nExtracts the gradient slice corresponding to this ranks original input\nfrom the full gradient tensor.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nctx\ntorch.autograd.function.FunctionCtx\ntorch.autograd function context.\nrequired\n\n\ngrad_output\ntorch.Tensor\nGradient from subsequent layers with respect to the concatenated output tensor.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntuple[torch.Tensor, None]\nTuple containing the gradient slice for this ranks input tensor and None for the process group parameter which doesnt require gradients.\n\n\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.AllGatherWithGrad.forward(\n ctx,\n input_tensor,\n group,\n)\nForward pass of all-gather of data with sequence dimension.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nctx\ntorch.autograd.function.FunctionCtx\ntorch.autograd function context.\nrequired\n\n\ninput_tensor\ntorch.Tensor\nTensor from model output with sequence dimension.\nrequired\n\n\ngroup\ndist.ProcessGroup\ntorch.distributed process group.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntorch.Tensor\nTensor from gathering the input_tensor from across the process group and concatenating along the sequence 
dimension.\n\n\n\n\n\n\n\n\n\nutils.ctx_managers.sequence_parallel.SequenceParallelContextManager(\n models,\n sequence_parallel_degree,\n gradient_accumulation_steps,\n ring_attn_func,\n heads_k_stride,\n gather_outputs,\n)\nContext manager for sequence parallelism operations.\nThis class provides a context that will automatically apply sequence parallelism\nduring model forward passes using a pre-forward hook, and gather outputs from\nacross the sequence parallelism group using a post-forward hook.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nmodels\nlist[nn.Module]\nList of models to apply sequence parallelism to pre- and post- forward hooks.\nrequired\n\n\nsequence_parallel_degree\nint\nNumber of processes to split sequences over.\nrequired\n\n\ngradient_accumulation_steps\nint\nNumber of steps to accumulate gradients over.\nrequired\n\n\nring_attn_func\nRingAttnFunc\nWhich ring attention function to use. Currently unused.\nrequired\n\n\nheads_k_stride\nint | None\nSequence parallelism K head stride size. Passed through to varlen_llama3 ring_flash_attn implementation.\nrequired\n\n\ngather_outputs\nbool\nWhether to gather outputs after model forward pass across the sequence parallel group.\nrequired"
},
{
"objectID": "docs/api/utils.ctx_managers.sequence_parallel.html#functions",


@@ -2,758 +2,758 @@
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://docs.axolotl.ai/docs/unsloth.html</loc>
<lastmod>2025-06-24T18:59:39.482Z</lastmod>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/conversation.html</loc>
<lastmod>2025-06-24T18:59:39.477Z</lastmod>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/stepwise_supervised.html</loc>
<lastmod>2025-06-24T18:59:39.478Z</lastmod>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/tokenized.html</loc>
<lastmod>2025-06-24T18:59:39.478Z</lastmod>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/mac.html</loc>
<lastmod>2025-06-24T18:59:39.481Z</lastmod>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/nccl.html</loc>
<lastmod>2025-06-24T18:59:39.481Z</lastmod>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/multi-node.html</loc>
<lastmod>2025-06-24T18:59:39.481Z</lastmod>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/docker.html</loc>
<lastmod>2025-06-24T18:59:39.478Z</lastmod>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/lr_groups.html</loc>
<lastmod>2025-06-24T18:59:39.481Z</lastmod>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/inference.html</loc>
<lastmod>2025-06-24T18:59:39.481Z</lastmod>
<lastmod>2025-06-25T12:34:07.137Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/cli.html</loc>
<lastmod>2025-06-24T18:59:39.477Z</lastmod>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/config-reference.html</loc>
<lastmod>2025-06-24T19:02:59.339Z</lastmod>
<lastmod>2025-06-25T12:37:22.831Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/multi-gpu.html</loc>
<lastmod>2025-06-24T18:59:39.481Z</lastmod>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/debugging.html</loc>
<lastmod>2025-06-24T18:59:39.478Z</lastmod>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/multimodal.html</loc>
<lastmod>2025-06-24T18:59:39.481Z</lastmod>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.sweeps.html</loc>
<lastmod>2025-06-24T19:02:46.206Z</lastmod>
<lastmod>2025-06-25T12:37:08.988Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.llama3.html</loc>
<lastmod>2025-06-24T19:02:46.533Z</lastmod>
<lastmod>2025-06-25T12:37:09.321Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schedulers.html</loc>
<lastmod>2025-06-24T19:02:46.930Z</lastmod>
<lastmod>2025-06-25T12:37:09.717Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.llama_attn_hijack_xformers.html</loc>
<lastmod>2025-06-24T19:02:46.733Z</lastmod>
<lastmod>2025-06-25T12:37:09.521Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.cloud.modal_.html</loc>
<lastmod>2025-06-24T19:02:46.254Z</lastmod>
<lastmod>2025-06-25T12:37:09.038Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/kernels.geglu.html</loc>
<lastmod>2025-06-24T19:02:46.687Z</lastmod>
<lastmod>2025-06-25T12:37:09.474Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.utils.html</loc>
<lastmod>2025-06-24T19:02:46.327Z</lastmod>
<lastmod>2025-06-25T12:37:09.111Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.datasets.chat.html</loc>
<lastmod>2025-06-24T19:02:46.057Z</lastmod>
<lastmod>2025-06-25T12:37:08.836Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.peft.html</loc>
<lastmod>2025-06-24T19:02:47.041Z</lastmod>
<lastmod>2025-06-25T12:37:09.830Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.btlm_attn_hijack_flash.html</loc>
<lastmod>2025-06-24T19:02:46.795Z</lastmod>
<lastmod>2025-06-25T12:37:09.585Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.chat_template.html</loc>
<lastmod>2025-06-24T19:02:46.434Z</lastmod>
<lastmod>2025-06-25T12:37:09.223Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.kto.user_defined.html</loc>
<lastmod>2025-06-24T19:02:46.569Z</lastmod>
<lastmod>2025-06-25T12:37:09.353Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.cloud.base.html</loc>
<lastmod>2025-06-24T19:02:46.248Z</lastmod>
<lastmod>2025-06-25T12:37:09.031Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/kernels.swiglu.html</loc>
<lastmod>2025-06-24T19:02:46.697Z</lastmod>
<lastmod>2025-06-25T12:37:09.484Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.stepwise_supervised.html</loc>
<lastmod>2025-06-24T19:02:46.498Z</lastmod>
<lastmod>2025-06-25T12:37:09.287Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.bradley_terry.llama3.html</loc>
<lastmod>2025-06-24T19:02:46.592Z</lastmod>
<lastmod>2025-06-25T12:37:09.377Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.completion.html</loc>
<lastmod>2025-06-24T19:02:46.487Z</lastmod>
<lastmod>2025-06-25T12:37:09.277Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/kernels.utils.html</loc>
<lastmod>2025-06-24T19:02:46.706Z</lastmod>
<lastmod>2025-06-25T12:37:09.493Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/common.datasets.html</loc>
<lastmod>2025-06-24T19:02:47.253Z</lastmod>
<lastmod>2025-06-25T12:37:10.043Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.datasets.html</loc>
<lastmod>2025-06-24T19:02:47.032Z</lastmod>
<lastmod>2025-06-25T12:37:09.821Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.builders.rl.html</loc>
<lastmod>2025-06-24T19:02:46.012Z</lastmod>
<lastmod>2025-06-25T12:37:08.791Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/evaluate.html</loc>
<lastmod>2025-06-24T19:02:45.916Z</lastmod>
<lastmod>2025-06-25T12:37:08.696Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/kernels.quantize.html</loc>
<lastmod>2025-06-24T19:02:46.705Z</lastmod>
<lastmod>2025-06-25T12:37:09.492Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.llama_attn_hijack_flash.html</loc>
<lastmod>2025-06-24T19:02:46.731Z</lastmod>
<lastmod>2025-06-25T12:37:09.520Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.mixins.rng_state_loader.html</loc>
<lastmod>2025-06-24T19:02:46.370Z</lastmod>
<lastmod>2025-06-25T12:37:09.156Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/integrations.base.html</loc>
<lastmod>2025-06-24T19:02:47.213Z</lastmod>
<lastmod>2025-06-25T12:37:10.003Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.merge_lora.html</loc>
<lastmod>2025-06-24T19:02:46.180Z</lastmod>
<lastmod>2025-06-25T12:37:08.961Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.merge_sharded_fsdp_weights.html</loc>
<lastmod>2025-06-24T19:02:46.192Z</lastmod>
<lastmod>2025-06-25T12:37:08.973Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.transformers_fa_utils.html</loc>
<lastmod>2025-06-24T19:02:46.812Z</lastmod>
<lastmod>2025-06-25T12:37:09.602Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.llama2_chat.html</loc>
<lastmod>2025-06-24T19:02:46.481Z</lastmod>
<lastmod>2025-06-25T12:37:09.271Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.collators.mm_chat.html</loc>
<lastmod>2025-06-24T19:02:47.282Z</lastmod>
<lastmod>2025-06-25T12:37:10.072Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.data.sft.html</loc>
<lastmod>2025-06-24T19:02:46.969Z</lastmod>
<lastmod>2025-06-25T12:37:09.756Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_instruct.html</loc>
<lastmod>2025-06-24T19:02:46.449Z</lastmod>
<lastmod>2025-06-25T12:37:09.238Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/integrations.liger.args.html</loc>
<lastmod>2025-06-24T19:02:47.228Z</lastmod>
<lastmod>2025-06-25T12:37:10.018Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.mistral_attn_hijack_flash.html</loc>
<lastmod>2025-06-24T19:02:46.747Z</lastmod>
<lastmod>2025-06-25T12:37:09.536Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.vllm_serve.html</loc>
<lastmod>2025-06-24T19:02:46.245Z</lastmod>
<lastmod>2025-06-25T12:37:09.028Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.utils.html</loc>
<lastmod>2025-06-24T19:02:46.793Z</lastmod>
<lastmod>2025-06-25T12:37:09.583Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/loaders.patch_manager.html</loc>
<lastmod>2025-06-24T19:02:46.360Z</lastmod>
<lastmod>2025-06-25T12:37:09.146Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.integrations.html</loc>
<lastmod>2025-06-24T19:02:47.061Z</lastmod>
<lastmod>2025-06-25T12:37:09.851Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.perplexity.html</loc>
<lastmod>2025-06-24T19:02:47.329Z</lastmod>
<lastmod>2025-06-25T12:37:10.119Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.utils.html</loc>
<lastmod>2025-06-24T19:02:46.238Z</lastmod>
<lastmod>2025-06-25T12:37:09.021Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.config.html</loc>
<lastmod>2025-06-24T19:02:47.003Z</lastmod>
<lastmod>2025-06-25T12:37:09.790Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.input_output.html</loc>
<lastmod>2025-06-24T19:02:46.493Z</lastmod>
<lastmod>2025-06-25T12:37:09.283Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.distributed.html</loc>
<lastmod>2025-06-24T19:02:46.950Z</lastmod>
<lastmod>2025-06-25T12:37:09.737Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.gradient_checkpointing.offload_disk.html</loc>
<lastmod>2025-06-24T19:02:46.848Z</lastmod>
<lastmod>2025-06-25T12:37:09.636Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.trainer_fsdp_optim.html</loc>
<lastmod>2025-06-24T19:02:46.805Z</lastmod>
<lastmod>2025-06-25T12:37:09.595Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.builders.base.html</loc>
<lastmod>2025-06-24T19:02:45.999Z</lastmod>
<lastmod>2025-06-25T12:37:08.778Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.trl.html</loc>
<lastmod>2025-06-24T19:02:46.286Z</lastmod>
<lastmod>2025-06-25T12:37:09.069Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.evaluate.html</loc>
<lastmod>2025-06-24T19:02:46.113Z</lastmod>
<lastmod>2025-06-25T12:37:08.893Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.optimizers.adopt.html</loc>
<lastmod>2025-06-24T19:02:46.961Z</lastmod>
<lastmod>2025-06-25T12:37:09.748Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.qat.html</loc>
<lastmod>2025-06-24T19:02:47.349Z</lastmod>
<lastmod>2025-06-25T12:37:10.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.dpo.trainer.html</loc>
<lastmod>2025-06-24T19:02:46.302Z</lastmod>
<lastmod>2025-06-25T12:37:09.086Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.chat.format.shared.html</loc>
<lastmod>2025-06-24T19:02:46.052Z</lastmod>
<lastmod>2025-06-25T12:37:08.831Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.relora.html</loc>
<lastmod>2025-06-24T19:02:46.756Z</lastmod>
<lastmod>2025-06-25T12:37:09.545Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.config.html</loc>
<lastmod>2025-06-24T19:02:46.157Z</lastmod>
<lastmod>2025-06-25T12:37:08.938Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.preprocess.html</loc>
<lastmod>2025-06-24T19:02:46.200Z</lastmod>
<lastmod>2025-06-25T12:37:08.982Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.base.html</loc>
<lastmod>2025-06-24T19:02:46.270Z</lastmod>
<lastmod>2025-06-25T12:37:09.053Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/convert.html</loc>
<lastmod>2025-06-24T19:02:45.941Z</lastmod>
<lastmod>2025-06-25T12:37:08.720Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.pygmalion.html</loc>
<lastmod>2025-06-24T19:02:46.515Z</lastmod>
<lastmod>2025-06-25T12:37:09.304Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.trl.html</loc>
<lastmod>2025-06-24T19:02:47.044Z</lastmod>
<lastmod>2025-06-25T12:37:09.833Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.args.html</loc>
<lastmod>2025-06-24T19:02:46.133Z</lastmod>
<lastmod>2025-06-25T12:37:08.913Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.chat_template.html</loc>
<lastmod>2025-06-24T19:02:46.521Z</lastmod>
<lastmod>2025-06-25T12:37:09.310Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/loaders.constants.html</loc>
<lastmod>2025-06-24T19:02:46.361Z</lastmod>
<lastmod>2025-06-25T12:37:09.147Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/logging_config.html</loc>
<lastmod>2025-06-24T19:02:45.993Z</lastmod>
<lastmod>2025-06-25T12:37:08.772Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.inference.html</loc>
<lastmod>2025-06-24T19:02:46.172Z</lastmod>
<lastmod>2025-06-25T12:37:08.953Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.ctx_managers.sequence_parallel.html</loc>
<lastmod>2025-06-24T19:02:46.400Z</lastmod>
<lastmod>2025-06-25T12:37:09.187Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/integrations.spectrum.args.html</loc>
<lastmod>2025-06-24T19:02:47.234Z</lastmod>
<lastmod>2025-06-25T12:37:10.025Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.training.html</loc>
<lastmod>2025-06-24T19:02:47.015Z</lastmod>
<lastmod>2025-06-25T12:37:09.803Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.orcamini.html</loc>
<lastmod>2025-06-24T19:02:46.508Z</lastmod>
<lastmod>2025-06-25T12:37:09.298Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.freeze.html</loc>
<lastmod>2025-06-24T19:02:46.889Z</lastmod>
<lastmod>2025-06-25T12:37:09.674Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/loaders.tokenizer.html</loc>
<lastmod>2025-06-25T12:37:09.130Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.bench.html</loc>
<lastmod>2025-06-25T12:37:09.667Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.quantization.html</loc>
<lastmod>2025-06-25T12:37:09.777Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/batch_vs_grad.html</loc>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/input_output.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/sequence_parallelism.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/reward_modelling.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/index.html</loc>
<lastmod>2025-06-25T12:34:07.151Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/src/axolotl/integrations/LICENSE.html</loc>
<lastmod>2025-06-25T12:34:07.155Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/FAQS.html</loc>
<lastmod>2025-06-25T12:34:07.132Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/src/axolotl/integrations/cut_cross_entropy/ACKNOWLEDGEMENTS.html</loc>
<lastmod>2025-06-25T12:34:07.156Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/TODO.html</loc>
<lastmod>2025-06-25T12:34:07.133Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/examples/colab-notebooks/colab-axolotl-example.html</loc>
<lastmod>2025-06-25T12:34:07.139Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/torchao.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/ray-integration.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/quantize.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/qat.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.lora.html</loc>
<lastmod>2025-06-25T12:37:09.658Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_w_system.html</loc>
<lastmod>2025-06-25T12:37:09.250Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.stablelm_attn_hijack_flash.html</loc>
<lastmod>2025-06-25T12:37:09.592Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.collators.core.html</loc>
<lastmod>2025-06-25T12:37:10.045Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.metharme.html</loc>
<lastmod>2025-06-25T12:37:09.294Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.profiler.html</loc>
<lastmod>2025-06-25T12:37:10.123Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.data.pretraining.html</loc>
<lastmod>2025-06-25T12:37:09.749Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.lisa.html</loc>
<lastmod>2025-06-25T12:37:10.124Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.trainer.html</loc>
<lastmod>2025-06-25T12:37:09.692Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/integrations.cut_cross_entropy.args.html</loc>
<lastmod>2025-06-25T12:37:10.007Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.model.html</loc>
<lastmod>2025-06-25T12:37:09.798Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.data.batch_dataset_fetcher.html</loc>
<lastmod>2025-06-25T12:37:09.605Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.zephyr.html</loc>
<lastmod>2025-06-25T12:37:09.332Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/datasets.html</loc>
<lastmod>2025-06-25T12:37:08.707Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.enums.html</loc>
<lastmod>2025-06-25T12:37:09.861Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/integrations.kd.trainer.html</loc>
<lastmod>2025-06-25T12:37:10.015Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.lora_kernels.html</loc>
<lastmod>2025-06-25T12:37:09.575Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.collators.batching.html</loc>
<lastmod>2025-06-25T12:37:10.064Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.grpo.sampler.html</loc>
<lastmod>2025-06-25T12:37:09.110Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.base.html</loc>
<lastmod>2025-06-25T12:37:09.189Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.multipack.html</loc>
<lastmod>2025-06-25T12:37:09.538Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.orpo.chat_template.html</loc>
<lastmod>2025-06-25T12:37:09.374Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.passthrough.html</loc>
<lastmod>2025-06-25T12:37:09.335Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.chat.format.chatml.html</loc>
<lastmod>2025-06-25T12:37:08.828Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.mixins.scheduler.html</loc>
<lastmod>2025-06-25T12:37:09.163Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.model_shard_quant.html</loc>
<lastmod>2025-06-25T12:37:09.663Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.kto.chatml.html</loc>
<lastmod>2025-06-25T12:37:09.352Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.tokenization.html</loc>
<lastmod>2025-06-25T12:37:09.642Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/loaders.model.html</loc>
<lastmod>2025-06-25T12:37:09.121Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.mlflow_.html</loc>
<lastmod>2025-06-25T12:37:10.128Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.grpo.trainer.html</loc>
<lastmod>2025-06-25T12:37:09.097Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.main.html</loc>
<lastmod>2025-06-25T12:37:08.876Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.callbacks.comet_.html</loc>
<lastmod>2025-06-25T12:37:10.131Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.chat_templates.html</loc>
<lastmod>2025-06-25T12:37:09.652Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.utils.html</loc>
<lastmod>2025-06-25T12:37:09.867Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/common.architectures.html</loc>
<lastmod>2025-06-25T12:37:10.026Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.llama_expand_mask.html</loc>
<lastmod>2025-06-25T12:37:09.546Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_chat.html</loc>
<lastmod>2025-06-25T12:37:09.236Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.samplers.multipack.html</loc>
<lastmod>2025-06-25T12:37:10.113Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/integrations.grokfast.optimizer.html</loc>
<lastmod>2025-06-25T12:37:10.008Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.chatml.html</loc>
<lastmod>2025-06-25T12:37:09.331Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.mixtral.html</loc>
<lastmod>2025-06-25T12:37:09.606Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/train.html</loc>
<lastmod>2025-06-25T12:37:08.685Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.llama_patch_multipack.html</loc>
<lastmod>2025-06-25T12:37:09.586Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/index.html</loc>
<lastmod>2025-06-25T12:37:08.622Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/loaders.adapter.html</loc>
<lastmod>2025-06-25T12:37:09.137Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.schemas.multimodal.html</loc>
<lastmod>2025-06-25T12:37:09.838Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/kernels.lora.html</loc>
<lastmod>2025-06-25T12:37:09.463Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.kto.llama3.html</loc>
<lastmod>2025-06-25T12:37:09.343Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.checks.html</loc>
<lastmod>2025-06-25T12:37:08.920Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.quantize.html</loc>
<lastmod>2025-06-25T12:37:09.043Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/integrations.lm_eval.args.html</loc>
<lastmod>2025-06-25T12:37:10.021Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.chat.messages.html</loc>
<lastmod>2025-06-25T12:37:08.826Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.builders.causal.html</loc>
<lastmod>2025-06-25T12:37:08.783Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.relora.html</loc>
<lastmod>2025-06-25T12:37:09.079Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/models.mamba.modeling_mamba.html</loc>
<lastmod>2025-06-25T12:37:10.044Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.gradient_checkpointing.offload_cpu.html</loc>
<lastmod>2025-06-25T12:37:09.610Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.mamba.html</loc>
<lastmod>2025-06-25T12:37:09.074Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.datasets.transforms.chat_builder.html</loc>
<lastmod>2025-06-25T12:37:08.844Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/loaders.processor.html</loc>
<lastmod>2025-06-25T12:37:09.131Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.chat.format.llama3x.html</loc>
<lastmod>2025-06-25T12:37:08.829Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.messages.chat.html</loc>
<lastmod>2025-06-25T12:37:09.309Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/cli.train.html</loc>
<lastmod>2025-06-25T12:37:08.885Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.trainers.mixins.optimizer.html</loc>
<lastmod>2025-06-25T12:37:09.153Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.collators.mamba.html</loc>
<lastmod>2025-06-25T12:37:10.067Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/monkeypatch.unsloth_.html</loc>
<lastmod>2025-06-25T12:37:09.603Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/utils.dict.html</loc>
<lastmod>2025-06-25T12:37:09.740Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.user_defined.html</loc>
<lastmod>2025-06-25T12:37:09.258Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/core.training_args.html</loc>
<lastmod>2025-06-25T12:37:08.803Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.user_defined.html</loc>
<lastmod>2025-06-25T12:37:09.334Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/prompt_tokenizers.html</loc>
<lastmod>2025-06-25T12:37:08.762Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/api/common.const.html</loc>
<lastmod>2025-06-25T12:37:10.028Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/fsdp_qlora.html</loc>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/custom_integrations.html</loc>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/getting-started.html</loc>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/faq.html</loc>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/lora_optims.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/rlhf.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/amd_hpc.html</loc>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/installation.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/multipack.html</loc>
<lastmod>2025-06-25T12:34:07.138Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset_preprocessing.html</loc>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset_loading.html</loc>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/inst_tune.html</loc>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/template_free.html</loc>
<lastmod>2025-06-25T12:34:07.135Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/index.html</loc>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
<url>
<loc>https://docs.axolotl.ai/docs/dataset-formats/pretraining.html</loc>
<lastmod>2025-06-25T12:34:07.134Z</lastmod>
</url>
</urlset>