From 857c949129ed1552c2e5e63d0d4c09b962f8ef09 Mon Sep 17 00:00:00 2001 From: Quarto GHA Workflow Runner Date: Wed, 25 Mar 2026 15:27:19 +0000 Subject: [PATCH] Built site for gh-pages --- .nojekyll | 2 +- docs/api/integrations.base.html | 447 ++++++++++++++++++++---------- search.json | 4 +- sitemap.xml | 470 ++++++++++++++++---------------- 4 files changed, 536 insertions(+), 387 deletions(-) diff --git a/.nojekyll b/.nojekyll index 2b1d454b3..178731cea 100644 --- a/.nojekyll +++ b/.nojekyll @@ -1 +1 @@ -8efe4c4d \ No newline at end of file +62d50f63 \ No newline at end of file diff --git a/docs/api/integrations.base.html b/docs/api/integrations.base.html index b30b9b7db..1530bb9f0 100644 --- a/docs/api/integrations.base.html +++ b/docs/api/integrations.base.html @@ -921,38 +921,42 @@ training.

Loads and preprocesses the dataset for training. +on_rollouts_scored +Called after rollouts are scored during online RL (GRPO/PPO). + + post_lora_load Performs actions after LoRA weights are loaded. - + post_model_build Performs actions after the model is built/loaded, but before any adapters are applied. - + post_model_load Performs actions after the model is loaded. - + post_train Performs actions after training is complete. - + post_train_unload Performs actions after training is complete and the model is unloaded. - + post_trainer_create Performs actions after the trainer is created. - + pre_lora_load Performs actions before LoRA weights are loaded. - + pre_model_load Performs actions before the model is loaded. - + register Registers the plugin with the given configuration as an unparsed dict. @@ -1445,14 +1449,86 @@ callbacks that require access to the model or trainer.

-
-
post_lora_load
-
integrations.base.BasePlugin.post_lora_load(cfg, model)
-

Performs actions after LoRA weights are loaded.

+
+
on_rollouts_scored
+
integrations.base.BasePlugin.on_rollouts_scored(
+    cfg,
+    trainer,
+    prompts,
+    completions,
+    rewards,
+    advantages,
+)
+

Called after rollouts are scored during online RL (GRPO/PPO).

+

Provides access to the full scored rollout data for logging, trace +storage, or analysis. Called once per scoring step with all samples +from that step.

Parameters
+++++ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
NameTypeDescriptionDefault
cfgDictDefaultThe axolotl configuration.required
trainerThe trainer instance.required
promptslist[str]List of prompt texts (one per sample).required
completionslist[str]List of completion texts (one per sample).required
rewardsdict[str, list[float]]Dict mapping reward function name to list of reward values.required
advantageslist[float]List of advantage values (one per sample).required
+
+
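As a sketch of how a plugin might use this hook, the following minimal class implements only `on_rollouts_scored` to collect one record per sample. The class name `RolloutLoggerPlugin` and the record layout are illustrative assumptions, not part of axolotl; the hook signature matches the one documented above.

```python
class RolloutLoggerPlugin:
    """Hypothetical plugin implementing only the on_rollouts_scored hook."""

    def __init__(self):
        self.records = []

    def on_rollouts_scored(self, cfg, trainer, prompts, completions, rewards, advantages):
        # rewards maps each reward-function name to a list of per-sample
        # values, so index it in step with prompts/completions/advantages.
        for i, (prompt, completion, adv) in enumerate(
            zip(prompts, completions, advantages)
        ):
            self.records.append(
                {
                    "prompt": prompt,
                    "completion": completion,
                    "advantage": adv,
                    "rewards": {name: vals[i] for name, vals in rewards.items()},
                }
            )
```

Because the hook is called once per scoring step with all samples from that step, a plugin like this sees every rollout exactly once.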
+
+
post_lora_load
+
integrations.base.BasePlugin.post_lora_load(cfg, model)
+

Performs actions after LoRA weights are loaded.

+
+
Parameters
+ +@@ -1485,10 +1561,10 @@ callbacks that require access to the model or trainer.

post_model_build
-
integrations.base.BasePlugin.post_model_build(cfg, model)
+
integrations.base.BasePlugin.post_model_build(cfg, model)

Performs actions after the model is built/loaded, but before any adapters are applied.

-
-
Parameters
+
+
Parameters
@@ -1517,10 +1593,10 @@ callbacks that require access to the model or trainer.

post_model_load
-
integrations.base.BasePlugin.post_model_load(cfg, model)
+
integrations.base.BasePlugin.post_model_load(cfg, model)

Performs actions after the model is loaded.

-
-
Parameters
+
+
Parameters
@@ -1555,10 +1631,10 @@ callbacks that require access to the model or trainer.

post_train
-
integrations.base.BasePlugin.post_train(cfg, model)
+
integrations.base.BasePlugin.post_train(cfg, model)

Performs actions after training is complete.

-
-
Parameters
+
+
Parameters
@@ -1593,10 +1669,10 @@ callbacks that require access to the model or trainer.

post_train_unload
-
integrations.base.BasePlugin.post_train_unload(cfg)
+
integrations.base.BasePlugin.post_train_unload(cfg)

Performs actions after training is complete and the model is unloaded.

-
-
Parameters
+
+
Parameters
@@ -1625,10 +1701,10 @@ callbacks that require access to the model or trainer.

post_trainer_create
-
integrations.base.BasePlugin.post_trainer_create(cfg, trainer)
+
integrations.base.BasePlugin.post_trainer_create(cfg, trainer)

Performs actions after the trainer is created.

-
-
Parameters
+
+
Parameters
@@ -1663,10 +1739,10 @@ callbacks that require access to the model or trainer.

pre_lora_load
-
integrations.base.BasePlugin.pre_lora_load(cfg, model)
+
integrations.base.BasePlugin.pre_lora_load(cfg, model)

Performs actions before LoRA weights are loaded.

-
-
Parameters
+
+
Parameters
@@ -1701,10 +1777,10 @@ callbacks that require access to the model or trainer.

pre_model_load
-
integrations.base.BasePlugin.pre_model_load(cfg)
+
integrations.base.BasePlugin.pre_model_load(cfg)

Performs actions before the model is loaded.

-
-
Parameters
+
+
Parameters
@@ -1733,10 +1809,10 @@ callbacks that require access to the model or trainer.

register
-
integrations.base.BasePlugin.register(cfg)
+
integrations.base.BasePlugin.register(cfg)

Registers the plugin with the given configuration as an unparsed dict.

-
-
Parameters
+
+
Parameters
@@ -1761,7 +1837,7 @@ callbacks that require access to the model or trainer.

PluginManager

-
integrations.base.PluginManager()
+
integrations.base.PluginManager()

The PluginManager class is responsible for loading and managing plugins. It should be a singleton so it can be accessed from anywhere in the codebase.

@@ -1845,38 +1921,42 @@ should be a singleton so it can be accessed from anywhere in the codebase.

+ + + + - + - + - + - + - + - + - + - + @@ -1884,10 +1964,10 @@ should be a singleton so it can be accessed from anywhere in the codebase.

Calls the load_datasets method of each registered plugin.
on_rollouts_scoredCalls the on_rollouts_scored method of all registered plugins.
post_lora_load Calls the post_lora_load method of all registered plugins.
post_model_build Calls the post_model_build method of all registered plugins after the
post_model_load Calls the post_model_load method of all registered plugins after the model
post_train Calls the post_train method of all registered plugins.
post_train_unload Calls the post_train_unload method of all registered plugins.
post_trainer_create Calls the post_trainer_create method of all registered plugins.
pre_lora_load Calls the pre_lora_load method of all registered plugins.
pre_model_load Calls the pre_model_load method of all registered plugins.
register Registers a new plugin by its name.
add_callbacks_post_trainer
-
integrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)
+
integrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)

Calls the add_callbacks_post_trainer method of all registered plugins.

-
-
Parameters
+
+
Parameters
@@ -1946,10 +2026,10 @@ should be a singleton so it can be accessed from anywhere in the codebase.

add_callbacks_pre_trainer
-
integrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)
+
integrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)

Calls the add_callbacks_pre_trainer method of all registered plugins.

-
-
Parameters
+
+
Parameters
@@ -2008,15 +2088,15 @@ should be a singleton so it can be accessed from anywhere in the codebase.

create_lr_scheduler
-
integrations.base.PluginManager.create_lr_scheduler(
-    trainer,
-    optimizer,
-    num_training_steps,
-)
+
integrations.base.PluginManager.create_lr_scheduler(
+    trainer,
+    optimizer,
+    num_training_steps,
+)

Calls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.

-
-
Parameters
+
+
Parameters
@@ -2075,11 +2155,11 @@ the first non-None scheduler.

create_optimizer
-
integrations.base.PluginManager.create_optimizer(trainer)
+
integrations.base.PluginManager.create_optimizer(trainer)

Calls the create_optimizer method of all registered plugins and returns the first non-None optimizer.
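The "first non-None" dispatch described here (and used by `create_lr_scheduler` as well) can be sketched as a plain function; the `plugins` mapping stands in for the manager's plugin registry and is an assumption for illustration, not the actual implementation:

```python
def create_optimizer(plugins, trainer):
    """Return the first non-None optimizer offered by any registered plugin.

    `plugins` is a mapping of plugin name -> plugin instance, mirroring the
    manager's registry; registration order decides which plugin wins.
    """
    for plugin in plugins.values():
        optimizer = plugin.create_optimizer(trainer)
        if optimizer is not None:
            return optimizer
    # No plugin supplied an optimizer; the caller falls back to the default.
    return None
```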

-
-
Parameters
+
+
Parameters
@@ -2126,7 +2206,7 @@ the first non-None optimizer.

get_collator_cls_and_kwargs
-
integrations.base.PluginManager.get_collator_cls_and_kwargs(cfg, is_eval=False)
+
integrations.base.PluginManager.get_collator_cls_and_kwargs(cfg, is_eval=False)

Calls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.

Parameters: cfg (dict): The configuration for the plugins. @@ -2136,7 +2216,7 @@ object: The collator class, or None if none was found.

get_input_args
-
integrations.base.PluginManager.get_input_args()
+
integrations.base.PluginManager.get_input_args()

Returns a list of Pydantic classes for all registered plugins’ input arguments.

Returns
@@ -2165,17 +2245,17 @@ object: The collator class, or None if none was found.

get_instance
-
integrations.base.PluginManager.get_instance()
+
integrations.base.PluginManager.get_instance()

Returns the singleton instance of PluginManager. If the instance doesn’t exist, it creates a new one.
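The lazily created singleton described here can be sketched as follows; this is a simplified illustration (the real `PluginManager` keeps an `OrderedDict[str, BasePlugin]` of plugins and exposes many more methods):

```python
class PluginManager:
    """Sketch of a lazily created singleton, as described above."""

    _instance = None

    def __init__(self):
        # Stand-in for the real registry of loaded plugins.
        self.plugins = {}

    @staticmethod
    def get_instance():
        # Create the instance on first access, then always return the same one.
        if PluginManager._instance is None:
            PluginManager._instance = PluginManager()
        return PluginManager._instance
```

Every call site that does `PluginManager.get_instance()` therefore shares the same plugin registry.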

get_trainer_cls
-
integrations.base.PluginManager.get_trainer_cls(cfg)
+
integrations.base.PluginManager.get_trainer_cls(cfg)

Calls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.

-
-
Parameters
+
+
Parameters
@@ -2228,7 +2308,7 @@ first non-None trainer class.

get_training_args
-
integrations.base.PluginManager.get_training_args(cfg)
+
integrations.base.PluginManager.get_training_args(cfg)

Calls the get_training_args method of all registered plugins and returns the combined training arguments.

Parameters: cfg (dict): The configuration for the plugins.

@@ -2237,17 +2317,17 @@ object: The training arguments

get_training_args_mixin
-
integrations.base.PluginManager.get_training_args_mixin()
+
integrations.base.PluginManager.get_training_args_mixin()

Returns a list of dataclasses for all registered plugins’ training args mixins.

Returns: list[str]: A list of dataclasses

load_datasets
-
integrations.base.PluginManager.load_datasets(cfg, preprocess=False)
+
integrations.base.PluginManager.load_datasets(cfg, preprocess=False)

Calls the load_datasets method of each registered plugin.

-
-
Parameters
+
+
Parameters
@@ -2304,57 +2384,25 @@ list[str]: A list of dataclasses


-
-
post_lora_load
-
integrations.base.PluginManager.post_lora_load(cfg, model)
-

Calls the post_lora_load method of all registered plugins.

-
-
Parameters
- ------ - - - - - - - - - - - - - - - - - - - - - - -
NameTypeDescriptionDefault
cfgDictDefaultThe configuration for the plugins.required
modelPreTrainedModel | PeftModelThe loaded model.required
-
-
-
-
post_model_build
-
integrations.base.PluginManager.post_model_build(cfg, model)
-

Calls the post_model_build method of all registered plugins after the -model has been built / loaded, but before any adapters have been applied.

+
+
on_rollouts_scored
+
integrations.base.PluginManager.on_rollouts_scored(
+    cfg,
+    trainer,
+    prompts,
+    completions,
+    rewards,
+    advantages,
+)
+

Calls the on_rollouts_scored method of all registered plugins.

Parameters
----++++ @@ -2372,20 +2420,43 @@ model has been built / loaded, but before any adapters have been applied.

- - - + + + + + + + + + + + + + + + + + + + + + + + + + + +
required
modelPreTrainedModelThe loaded model.trainerThe trainer instance.required
promptslist[str]List of prompt texts.required
completionslist[str]List of completion texts.required
rewardsdict[str, list[float]]Dict mapping reward function name to list of rewards.required
advantageslist[float]List of advantage values. required
-
-
post_model_load
-
integrations.base.PluginManager.post_model_load(cfg, model)
-

Calls the post_model_load method of all registered plugins after the model -has been loaded inclusive of any adapters.

+
+
post_lora_load
+
integrations.base.PluginManager.post_lora_load(cfg, model)
+

Calls the post_lora_load method of all registered plugins.

Parameters
@@ -2420,14 +2491,92 @@ has been loaded inclusive of any adapters.

-
-
post_train
-
integrations.base.PluginManager.post_train(cfg, model)
-

Calls the post_train method of all registered plugins.

+
+
post_model_build
+
integrations.base.PluginManager.post_model_build(cfg, model)
+

Calls the post_model_build method of all registered plugins after the +model has been built / loaded, but before any adapters have been applied.

Parameters
+++++ + + + + + + + + + + + + + + + + + + + + + + +
NameTypeDescriptionDefault
cfgDictDefaultThe configuration for the plugins.required
modelPreTrainedModelThe loaded model.required
+
+
+
+
post_model_load
+
integrations.base.PluginManager.post_model_load(cfg, model)
+

Calls the post_model_load method of all registered plugins after the model +has been loaded inclusive of any adapters.

+
+
Parameters
+ ++++++ + + + + + + + + + + + + + + + + + + + + + + +
NameTypeDescriptionDefault
cfgDictDefaultThe configuration for the plugins.required
modelPreTrainedModel | PeftModelThe loaded model.required
+
+
+
+
post_train
+
integrations.base.PluginManager.post_train(cfg, model)
+

Calls the post_train method of all registered plugins.

+
+
Parameters
+ +@@ -2460,10 +2609,10 @@ has been loaded inclusive of any adapters.

post_train_unload
-
integrations.base.PluginManager.post_train_unload(cfg)
+
integrations.base.PluginManager.post_train_unload(cfg)

Calls the post_train_unload method of all registered plugins.

-
-
Parameters
+
+
Parameters
@@ -2492,10 +2641,10 @@ has been loaded inclusive of any adapters.

post_trainer_create
-
integrations.base.PluginManager.post_trainer_create(cfg, trainer)
+
integrations.base.PluginManager.post_trainer_create(cfg, trainer)

Calls the post_trainer_create method of all registered plugins.

-
-
Parameters
+
+
Parameters
@@ -2530,10 +2679,10 @@ has been loaded inclusive of any adapters.

pre_lora_load
-
integrations.base.PluginManager.pre_lora_load(cfg, model)
+
integrations.base.PluginManager.pre_lora_load(cfg, model)

Calls the pre_lora_load method of all registered plugins.

-
-
Parameters
+
+
Parameters
@@ -2568,10 +2717,10 @@ has been loaded inclusive of any adapters.

pre_model_load
-
integrations.base.PluginManager.pre_model_load(cfg)
+
integrations.base.PluginManager.pre_model_load(cfg)

Calls the pre_model_load method of all registered plugins.

-
-
Parameters
+
+
Parameters
@@ -2600,10 +2749,10 @@ has been loaded inclusive of any adapters.

register
-
integrations.base.PluginManager.register(plugin_name)
+
integrations.base.PluginManager.register(plugin_name)

Registers a new plugin by its name.

-
-
Parameters
+
+
Parameters
@@ -2670,13 +2819,13 @@ has been loaded inclusive of any adapters.

load_plugin

-
integrations.base.load_plugin(plugin_name)
+
integrations.base.load_plugin(plugin_name)

Loads a plugin based on the given plugin name.

The plugin name should be in the format “module_name.class_name”. This function splits the plugin name into module and class, imports the module, retrieves the class from the module, and creates an instance of the class.
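The resolution of a “module_name.class_name” string can be sketched with `importlib`; this is a minimal reimplementation for illustration, not the exact axolotl code:

```python
import importlib


def load_plugin(plugin_name: str):
    """Split "module_name.class_name", import the module, instantiate the class."""
    module_name, class_name = plugin_name.rsplit(".", 1)
    try:
        module = importlib.import_module(module_name)
    except ImportError as err:
        # Mirrors the documented behavior: surface an ImportError if the
        # plugin module cannot be imported.
        raise ImportError(f"Could not import plugin module {module_name!r}") from err
    plugin_cls = getattr(module, class_name)
    return plugin_cls()
```

For example, `load_plugin("some_package.MyPlugin")` would import `some_package` and return `MyPlugin()` (assuming the class takes no constructor arguments, as `BasePlugin` subclasses here do).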

-
-

Parameters

+
+

Parameters

diff --git a/search.json b/search.json index 316d6a1d7..546d988a6 100644 --- a/search.json +++ b/search.json @@ -4663,14 +4663,14 @@ "href": "docs/api/integrations.base.html", "title": "integrations.base", "section": "", - "text": "integrations.base\nBase class for all plugins.\nA plugin is a reusable, modular, and self-contained piece of code that extends the functionality of Axolotl.\nPlugins can be used to integrate third-party models, modify the training process, or add new features.\nTo create a new plugin, you need to inherit from the BasePlugin class and implement the required methods.\n\n\n\n\n\nName\nDescription\n\n\n\n\nBaseOptimizerFactory\nBase class for factories to create custom optimizers\n\n\nBasePlugin\nBase class for all plugins. Defines the interface for plugin methods.\n\n\nPluginManager\nThe PluginManager class is responsible for loading and managing plugins. It\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory()\nBase class for factories to create custom optimizers\n\n\n\n\n\nName\nDescription\n\n\n\n\nget_decay_parameter_names\nGet all parameter names that weight decay will be applied to.\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory.get_decay_parameter_names(model)\nGet all parameter names that weight decay will be applied to.\nThis function filters out parameters in two ways:\n1. By layer type (instances of layers specified in ALL_LAYERNORM_LAYERS)\n2. By parameter name patterns (containing ‘bias’, or variation of ‘norm’)\n\n\n\n\n\nintegrations.base.BasePlugin()\nBase class for all plugins. Defines the interface for plugin methods.\nA plugin is a reusable, modular, and self-contained piece of code that extends\nthe functionality of Axolotl. 
Plugins can be used to integrate third-party models,\nmodify the training process, or add new features.\nTo create a new plugin, you need to inherit from the BasePlugin class and\nimplement the required methods.\n\n\nPlugin methods include:\n- register(cfg): Registers the plugin with the given configuration.\n- load_datasets(cfg): Loads and preprocesses the dataset for training.\n- pre_model_load(cfg): Performs actions before the model is loaded.\n- post_model_build(cfg, model): Performs actions after the model is loaded, but\nbefore LoRA adapters are applied.\n- pre_lora_load(cfg, model): Performs actions before LoRA weights are loaded.\n- post_lora_load(cfg, model): Performs actions after LoRA weights are loaded.\n- post_model_load(cfg, model): Performs actions after the model is loaded,\ninclusive of any adapters.\n- post_trainer_create(cfg, trainer): Performs actions after the trainer is\ncreated.\n- create_optimizer(cfg, trainer): Creates and returns an optimizer for training.\n- create_lr_scheduler(cfg, trainer, optimizer, num_training_steps): Creates and\nreturns a learning rate scheduler.\n- add_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer before\ntraining.\n- add_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after\ntraining.\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nAdds callbacks to the trainer after creating the trainer. 
This is useful for\n\n\nadd_callbacks_pre_trainer\nSet up callbacks before creating the trainer.\n\n\ncreate_lr_scheduler\nCreates and returns a learning rate scheduler.\n\n\ncreate_optimizer\nCreates and returns an optimizer for training.\n\n\nget_collator_cls_and_kwargs\nReturns a custom class for the collator.\n\n\nget_input_args\nReturns a pydantic model for the plugin’s input arguments.\n\n\nget_trainer_cls\nReturns a custom class for the trainer.\n\n\nget_training_args\nReturns custom training arguments to set on TrainingArgs.\n\n\nget_training_args_mixin\nReturns a dataclass model for the plugin’s training arguments.\n\n\nload_datasets\nLoads and preprocesses the dataset for training.\n\n\npost_lora_load\nPerforms actions after LoRA weights are loaded.\n\n\npost_model_build\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\npost_model_load\nPerforms actions after the model is loaded.\n\n\npost_train\nPerforms actions after training is complete.\n\n\npost_train_unload\nPerforms actions after training is complete and the model is unloaded.\n\n\npost_trainer_create\nPerforms actions after the trainer is created.\n\n\npre_lora_load\nPerforms actions before LoRA weights are loaded.\n\n\npre_model_load\nPerforms actions before the model is loaded.\n\n\nregister\nRegisters the plugin with the given configuration as an unparsed dict.\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_post_trainer(cfg, trainer)\nAdds callbacks to the trainer after creating the trainer. 
This is useful for\ncallbacks that require access to the model or trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_pre_trainer(cfg, model)\nSet up callbacks before creating the trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.create_lr_scheduler(\n cfg,\n trainer,\n optimizer,\n num_training_steps,\n)\nCreates and returns a learning rate scheduler.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\noptimizer\nOptimizer\nThe optimizer for training.\nrequired\n\n\nnum_training_steps\nint\nTotal number of training steps\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nLRScheduler | None\nThe created learning rate scheduler.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.create_optimizer(cfg, trainer)\nCreates and returns an optimizer for training.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nOptimizer | None\nThe created optimizer.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_collator_cls_and_kwargs(cfg, is_eval=False)\nReturns a custom class for the 
collator.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\nis_eval\nbool\nWhether this is an eval split.\nFalse\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nclass\n\nThe class for the collator.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_input_args()\nReturns a pydantic model for the plugin’s input arguments.\n\n\n\nintegrations.base.BasePlugin.get_trainer_cls(cfg)\nReturns a custom class for the trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntype[Trainer] | None\nThe first non-None trainer class returned by a plugin.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_training_args(cfg)\nReturns custom training arguments to set on TrainingArgs.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nobject\n\ndict containing the training arguments.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_training_args_mixin()\nReturns a dataclass model for the plugin’s training arguments.\n\n\n\nintegrations.base.BasePlugin.load_datasets(cfg, preprocess=False)\nLoads and preprocesses the dataset for training.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\npreprocess\nbool\nWhether this is the preprocess step of the datasets.\nFalse\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\ndataset_meta\nUnion['TrainDatasetMeta', None]\nThe metadata for the training dataset.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_lora_load(cfg, model)\nPerforms actions after LoRA weights are loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the 
plugin.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_build(cfg, model)\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_load(cfg, model)\nPerforms actions after the model is loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_train(cfg, model)\nPerforms actions after training is complete.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe axolotl configuration.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_train_unload(cfg)\nPerforms actions after training is complete and the model is unloaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_trainer_create(cfg, trainer)\nPerforms actions after the trainer is created.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.pre_lora_load(cfg, model)\nPerforms actions before LoRA weights are loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.pre_model_load(cfg)\nPerforms actions before the model is 
loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.register(cfg)\nRegisters the plugin with the given configuration as an unparsed dict.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\n\n\nintegrations.base.PluginManager()\nThe PluginManager class is responsible for loading and managing plugins. It\nshould be a singleton so it can be accessed from anywhere in the codebase.\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nplugins\nOrderedDict[str, BasePlugin]\nA list of loaded plugins.\n\n\n\n\n\n\nKey methods include:\n- get_instance(): Static method to get the singleton instance of PluginManager.\n- register(plugin_name: str): Registers a new plugin by its name.\n- pre_model_load(cfg): Calls the pre_model_load method of all registered plugins.\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\nadd_callbacks_pre_trainer\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\ncreate_lr_scheduler\nCalls the create_lr_scheduler method of all registered plugins and returns\n\n\ncreate_optimizer\nCalls the create_optimizer method of all registered plugins and returns\n\n\nget_collator_cls_and_kwargs\nCalls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.\n\n\nget_input_args\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\n\n\nget_instance\nReturns the singleton instance of PluginManager. 
If the instance doesn’t\n\n\nget_trainer_cls\nCalls the get_trainer_cls method of all registered plugins and returns the\n\n\nget_training_args\nCalls the get_training_args method of all registered plugins and returns the combined training arguments.\n\n\nget_training_args_mixin\nReturns a list of dataclasses for all registered plugins’ training args mixins’\n\n\nload_datasets\nCalls the load_datasets method of each registered plugin.\n\n\npost_lora_load\nCalls the post_lora_load method of all registered plugins.\n\n\npost_model_build\nCalls the post_model_build method of all registered plugins after the\n\n\npost_model_load\nCalls the post_model_load method of all registered plugins after the model\n\n\npost_train\nCalls the post_train method of all registered plugins.\n\n\npost_train_unload\nCalls the post_train_unload method of all registered plugins.\n\n\npost_trainer_create\nCalls the post_trainer_create method of all registered plugins.\n\n\npre_lora_load\nCalls the pre_lora_load method of all registered plugins.\n\n\npre_model_load\nCalls the pre_model_load method of all registered plugins.\n\n\nregister\nRegisters a new plugin by its name.\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded 
model.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.create_lr_scheduler(\n trainer,\n optimizer,\n num_training_steps,\n)\nCalls the create_lr_scheduler method of all registered plugins and returns\nthe first non-None scheduler.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\noptimizer\nOptimizer\nThe optimizer for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nLRScheduler | None\nThe created learning rate scheduler, or None if not found.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.create_optimizer(trainer)\nCalls the create_optimizer method of all registered plugins and returns\nthe first non-None optimizer.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nOptimizer | None\nThe created optimizer, or None if none was found.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_collator_cls_and_kwargs(cfg, is_eval=False)\nCalls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.\nParameters:\ncfg (dict): The configuration for the plugins.\nis_eval (bool): Whether this is an eval split.\nReturns:\nobject: The collator class, or None if none was found.\n\n\n\nintegrations.base.PluginManager.get_input_args()\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[str]\nA list of Pydantic classes for all registered plugins’ input arguments.’\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_instance()\nReturns the singleton instance of PluginManager. 
If the instance doesn’t\nexist, it creates a new one.\n\n\n\nintegrations.base.PluginManager.get_trainer_cls(cfg)\nCalls the get_trainer_cls method of all registered plugins and returns the\nfirst non-None trainer class.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nTrainer | None\nThe first non-None trainer class returned by a plugin.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_training_args(cfg)\nCalls the get_training_args method of all registered plugins and returns the combined training arguments.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nobject: The training arguments\n\n\n\nintegrations.base.PluginManager.get_training_args_mixin()\nReturns a list of dataclasses for all registered plugins’ training args mixins’\nReturns:\nlist[str]: A list of dataclsses\n\n\n\nintegrations.base.PluginManager.load_datasets(cfg, preprocess=False)\nCalls the load_datasets method of each registered plugin.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\npreprocess\nbool\nWhether this is preprocess step of the datasets.\nFalse\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nUnion['TrainDatasetMeta', None]\nThe dataset metadata loaded from all registered plugins.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_lora_load(cfg, model)\nCalls the post_lora_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_build(cfg, model)\nCalls the post_model_build method of all registered plugins after the\nmodel has been built / loaded, but before any adapters have been 
applied.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_load(cfg, model)\nCalls the post_model_load method of all registered plugins after the model\nhas been loaded inclusive of any adapters.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_train(cfg, model)\nCalls the post_train method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_train_unload(cfg)\nCalls the post_train_unload method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_trainer_create(cfg, trainer)\nCalls the post_trainer_create method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.pre_lora_load(cfg, model)\nCalls the pre_lora_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.pre_model_load(cfg)\nCalls the pre_model_load method of all registered 
plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.register(plugin_name)\nRegisters a new plugin by its name.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nplugin_name\nstr\nThe name of the plugin to be registered.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nImportError\nIf the plugin module cannot be imported.\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nload_plugin\nLoads a plugin based on the given plugin name.\n\n\n\n\n\nintegrations.base.load_plugin(plugin_name)\nLoads a plugin based on the given plugin name.\nThe plugin name should be in the format “module_name.class_name”. This function\nsplits the plugin name into module and class, imports the module, retrieves the\nclass from the module, and creates an instance of the class.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nplugin_name\nstr\nThe name of the plugin to be loaded. The name should be in the format “module_name.class_name”.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nBasePlugin\nAn instance of the loaded plugin.\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nImportError\nIf the plugin module cannot be imported." + "text": "integrations.base\nBase class for all plugins.\nA plugin is a reusable, modular, and self-contained piece of code that extends the functionality of Axolotl.\nPlugins can be used to integrate third-party models, modify the training process, or add new features.\nTo create a new plugin, you need to inherit from the BasePlugin class and implement the required methods.\n\n\n\n\n\nName\nDescription\n\n\n\n\nBaseOptimizerFactory\nBase class for factories to create custom optimizers\n\n\nBasePlugin\nBase class for all plugins. Defines the interface for plugin methods.\n\n\nPluginManager\nThe PluginManager class is responsible for loading and managing plugins. 
It\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory()\nBase class for factories to create custom optimizers\n\n\n\n\n\nName\nDescription\n\n\n\n\nget_decay_parameter_names\nGet all parameter names that weight decay will be applied to.\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory.get_decay_parameter_names(model)\nGet all parameter names that weight decay will be applied to.\nThis function filters out parameters in two ways:\n1. By layer type (instances of layers specified in ALL_LAYERNORM_LAYERS)\n2. By parameter name patterns (containing ‘bias’, or variation of ‘norm’)\n\n\n\n\n\nintegrations.base.BasePlugin()\nBase class for all plugins. Defines the interface for plugin methods.\nA plugin is a reusable, modular, and self-contained piece of code that extends\nthe functionality of Axolotl. Plugins can be used to integrate third-party models,\nmodify the training process, or add new features.\nTo create a new plugin, you need to inherit from the BasePlugin class and\nimplement the required methods.\n\n\nPlugin methods include:\n- register(cfg): Registers the plugin with the given configuration.\n- load_datasets(cfg): Loads and preprocesses the dataset for training.\n- pre_model_load(cfg): Performs actions before the model is loaded.\n- post_model_build(cfg, model): Performs actions after the model is loaded, but\nbefore LoRA adapters are applied.\n- pre_lora_load(cfg, model): Performs actions before LoRA weights are loaded.\n- post_lora_load(cfg, model): Performs actions after LoRA weights are loaded.\n- post_model_load(cfg, model): Performs actions after the model is loaded,\ninclusive of any adapters.\n- post_trainer_create(cfg, trainer): Performs actions after the trainer is\ncreated.\n- create_optimizer(cfg, trainer): Creates and returns an optimizer for training.\n- create_lr_scheduler(cfg, trainer, optimizer, num_training_steps): Creates and\nreturns a learning rate scheduler.\n- add_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer 
before\ntraining.\n- add_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after\ntraining.\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nAdds callbacks to the trainer after creating the trainer. This is useful for\n\n\nadd_callbacks_pre_trainer\nSet up callbacks before creating the trainer.\n\n\ncreate_lr_scheduler\nCreates and returns a learning rate scheduler.\n\n\ncreate_optimizer\nCreates and returns an optimizer for training.\n\n\nget_collator_cls_and_kwargs\nReturns a custom class for the collator.\n\n\nget_input_args\nReturns a pydantic model for the plugin’s input arguments.\n\n\nget_trainer_cls\nReturns a custom class for the trainer.\n\n\nget_training_args\nReturns custom training arguments to set on TrainingArgs.\n\n\nget_training_args_mixin\nReturns a dataclass model for the plugin’s training arguments.\n\n\nload_datasets\nLoads and preprocesses the dataset for training.\n\n\non_rollouts_scored\nCalled after rollouts are scored during online RL (GRPO/PPO).\n\n\npost_lora_load\nPerforms actions after LoRA weights are loaded.\n\n\npost_model_build\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\npost_model_load\nPerforms actions after the model is loaded.\n\n\npost_train\nPerforms actions after training is complete.\n\n\npost_train_unload\nPerforms actions after training is complete and the model is unloaded.\n\n\npost_trainer_create\nPerforms actions after the trainer is created.\n\n\npre_lora_load\nPerforms actions before LoRA weights are loaded.\n\n\npre_model_load\nPerforms actions before the model is loaded.\n\n\nregister\nRegisters the plugin with the given configuration as an unparsed dict.\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_post_trainer(cfg, trainer)\nAdds callbacks to the trainer after creating the trainer. 
This is useful for\ncallbacks that require access to the model or trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_pre_trainer(cfg, model)\nSet up callbacks before creating the trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.create_lr_scheduler(\n cfg,\n trainer,\n optimizer,\n num_training_steps,\n)\nCreates and returns a learning rate scheduler.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\noptimizer\nOptimizer\nThe optimizer for training.\nrequired\n\n\nnum_training_steps\nint\nTotal number of training steps\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nLRScheduler | None\nThe created learning rate scheduler.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.create_optimizer(cfg, trainer)\nCreates and returns an optimizer for training.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nOptimizer | None\nThe created optimizer.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_collator_cls_and_kwargs(cfg, is_eval=False)\nReturns a custom class for the 
collator.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\nis_eval\nbool\nWhether this is an eval split.\nFalse\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nclass\n\nThe class for the collator.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_input_args()\nReturns a pydantic model for the plugin’s input arguments.\n\n\n\nintegrations.base.BasePlugin.get_trainer_cls(cfg)\nReturns a custom class for the trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntype[Trainer] | None\nThe first non-None trainer class returned by a plugin.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_training_args(cfg)\nReturns custom training arguments to set on TrainingArgs.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nobject\n\ndict containing the training arguments.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_training_args_mixin()\nReturns a dataclass model for the plugin’s training arguments.\n\n\n\nintegrations.base.BasePlugin.load_datasets(cfg, preprocess=False)\nLoads and preprocesses the dataset for training.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\npreprocess\nbool\nWhether this is the preprocess step of the datasets.\nFalse\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\ndataset_meta\nUnion['TrainDatasetMeta', None]\nThe metadata for the training dataset.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.on_rollouts_scored(\n cfg,\n trainer,\n prompts,\n completions,\n rewards,\n advantages,\n)\nCalled after rollouts are scored during online RL (GRPO/PPO).\nProvides access to the full scored rollout data for 
logging, trace\nstorage, or analysis. Called once per scoring step with all samples\nfrom that step.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe axolotl configuration.\nrequired\n\n\ntrainer\n\nThe trainer instance.\nrequired\n\n\nprompts\nlist[str]\nList of prompt texts (one per sample).\nrequired\n\n\ncompletions\nlist[str]\nList of completion texts (one per sample).\nrequired\n\n\nrewards\ndict[str, list[float]]\nDict mapping reward function name to list of reward values.\nrequired\n\n\nadvantages\nlist[float]\nList of advantage values (one per sample).\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_lora_load(cfg, model)\nPerforms actions after LoRA weights are loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_build(cfg, model)\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_load(cfg, model)\nPerforms actions after the model is loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_train(cfg, model)\nPerforms actions after training is complete.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe axolotl configuration.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_train_unload(cfg)\nPerforms actions after training is complete and the model is 
unloaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_trainer_create(cfg, trainer)\nPerforms actions after the trainer is created.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.pre_lora_load(cfg, model)\nPerforms actions before LoRA weights are loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.pre_model_load(cfg)\nPerforms actions before the model is loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.register(cfg)\nRegisters the plugin with the given configuration as an unparsed dict.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\n\n\nintegrations.base.PluginManager()\nThe PluginManager class is responsible for loading and managing plugins. 
It\nshould be a singleton so it can be accessed from anywhere in the codebase.\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nplugins\nOrderedDict[str, BasePlugin]\nAn ordered dict of loaded plugins.\n\n\n\n\n\n\nKey methods include:\n- get_instance(): Static method to get the singleton instance of PluginManager.\n- register(plugin_name: str): Registers a new plugin by its name.\n- pre_model_load(cfg): Calls the pre_model_load method of all registered plugins.\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\nadd_callbacks_pre_trainer\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\ncreate_lr_scheduler\nCalls the create_lr_scheduler method of all registered plugins and returns\n\n\ncreate_optimizer\nCalls the create_optimizer method of all registered plugins and returns\n\n\nget_collator_cls_and_kwargs\nCalls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.\n\n\nget_input_args\nReturns a list of Pydantic classes for all registered plugins’ input arguments.\n\n\nget_instance\nReturns the singleton instance of PluginManager. 
If the instance doesn’t\n\n\nget_trainer_cls\nCalls the get_trainer_cls method of all registered plugins and returns the\n\n\nget_training_args\nCalls the get_training_args method of all registered plugins and returns the combined training arguments.\n\n\nget_training_args_mixin\nReturns a list of dataclasses for all registered plugins’ training args mixins\n\n\nload_datasets\nCalls the load_datasets method of each registered plugin.\n\n\non_rollouts_scored\nCalls the on_rollouts_scored method of all registered plugins.\n\n\npost_lora_load\nCalls the post_lora_load method of all registered plugins.\n\n\npost_model_build\nCalls the post_model_build method of all registered plugins after the\n\n\npost_model_load\nCalls the post_model_load method of all registered plugins after the model\n\n\npost_train\nCalls the post_train method of all registered plugins.\n\n\npost_train_unload\nCalls the post_train_unload method of all registered plugins.\n\n\npost_trainer_create\nCalls the post_trainer_create method of all registered plugins.\n\n\npre_lora_load\nCalls the pre_lora_load method of all registered plugins.\n\n\npre_model_load\nCalls the pre_model_load method of all registered plugins.\n\n\nregister\nRegisters a new plugin by its name.\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe 
configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.create_lr_scheduler(\n trainer,\n optimizer,\n num_training_steps,\n)\nCalls the create_lr_scheduler method of all registered plugins and returns\nthe first non-None scheduler.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\noptimizer\nOptimizer\nThe optimizer for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nLRScheduler | None\nThe created learning rate scheduler, or None if not found.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.create_optimizer(trainer)\nCalls the create_optimizer method of all registered plugins and returns\nthe first non-None optimizer.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nOptimizer | None\nThe created optimizer, or None if none was found.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_collator_cls_and_kwargs(cfg, is_eval=False)\nCalls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.\nParameters:\ncfg (dict): The configuration for the plugins.\nis_eval (bool): Whether this is an eval split.\nReturns:\nobject: The collator class, or None if none was found.\n\n\n\nintegrations.base.PluginManager.get_input_args()\nReturns a list of Pydantic classes for all registered plugins’ input arguments.\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[str]\nA list of Pydantic classes for all registered plugins’ input arguments.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_instance()\nReturns the singleton instance of PluginManager. 
If the instance doesn’t\nexist, it creates a new one.\n\n\n\nintegrations.base.PluginManager.get_trainer_cls(cfg)\nCalls the get_trainer_cls method of all registered plugins and returns the\nfirst non-None trainer class.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nTrainer | None\nThe first non-None trainer class returned by a plugin.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_training_args(cfg)\nCalls the get_training_args method of all registered plugins and returns the combined training arguments.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nobject: The training arguments.\n\n\n\nintegrations.base.PluginManager.get_training_args_mixin()\nReturns a list of dataclasses for all registered plugins’ training args mixins\nReturns:\nlist[str]: A list of dataclasses\n\n\n\nintegrations.base.PluginManager.load_datasets(cfg, preprocess=False)\nCalls the load_datasets method of each registered plugin.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\npreprocess\nbool\nWhether this is the preprocess step of the datasets.\nFalse\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nUnion['TrainDatasetMeta', None]\nThe dataset metadata loaded from all registered plugins.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.on_rollouts_scored(\n cfg,\n trainer,\n prompts,\n completions,\n rewards,\n advantages,\n)\nCalls the on_rollouts_scored method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\n\nThe trainer instance.\nrequired\n\n\nprompts\nlist[str]\nList of prompt texts.\nrequired\n\n\ncompletions\nlist[str]\nList of completion texts.\nrequired\n\n\nrewards\ndict[str, list[float]]\nDict 
mapping reward function name to list of rewards.\nrequired\n\n\nadvantages\nlist[float]\nList of advantage values.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_lora_load(cfg, model)\nCalls the post_lora_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_build(cfg, model)\nCalls the post_model_build method of all registered plugins after the\nmodel has been built / loaded, but before any adapters have been applied.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_load(cfg, model)\nCalls the post_model_load method of all registered plugins after the model\nhas been loaded inclusive of any adapters.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_train(cfg, model)\nCalls the post_train method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_train_unload(cfg)\nCalls the post_train_unload method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_trainer_create(cfg, trainer)\nCalls the post_trainer_create method of all registered 
plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.pre_lora_load(cfg, model)\nCalls the pre_lora_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.pre_model_load(cfg)\nCalls the pre_model_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.register(plugin_name)\nRegisters a new plugin by its name.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nplugin_name\nstr\nThe name of the plugin to be registered.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nImportError\nIf the plugin module cannot be imported.\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nload_plugin\nLoads a plugin based on the given plugin name.\n\n\n\n\n\nintegrations.base.load_plugin(plugin_name)\nLoads a plugin based on the given plugin name.\nThe plugin name should be in the format “module_name.class_name”. This function\nsplits the plugin name into module and class, imports the module, retrieves the\nclass from the module, and creates an instance of the class.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nplugin_name\nstr\nThe name of the plugin to be loaded. The name should be in the format “module_name.class_name”.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nBasePlugin\nAn instance of the loaded plugin.\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nImportError\nIf the plugin module cannot be imported." 
}, { "objectID": "docs/api/integrations.base.html#classes", "href": "docs/api/integrations.base.html#classes", "title": "integrations.base", "section": "", - "text": "Name\nDescription\n\n\n\n\nBaseOptimizerFactory\nBase class for factories to create custom optimizers\n\n\nBasePlugin\nBase class for all plugins. Defines the interface for plugin methods.\n\n\nPluginManager\nThe PluginManager class is responsible for loading and managing plugins. It\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory()\nBase class for factories to create custom optimizers\n\n\n\n\n\nName\nDescription\n\n\n\n\nget_decay_parameter_names\nGet all parameter names that weight decay will be applied to.\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory.get_decay_parameter_names(model)\nGet all parameter names that weight decay will be applied to.\nThis function filters out parameters in two ways:\n1. By layer type (instances of layers specified in ALL_LAYERNORM_LAYERS)\n2. By parameter name patterns (containing ‘bias’, or variation of ‘norm’)\n\n\n\n\n\nintegrations.base.BasePlugin()\nBase class for all plugins. Defines the interface for plugin methods.\nA plugin is a reusable, modular, and self-contained piece of code that extends\nthe functionality of Axolotl. 
Plugins can be used to integrate third-party models,\nmodify the training process, or add new features.\nTo create a new plugin, you need to inherit from the BasePlugin class and\nimplement the required methods.\n\n\nPlugin methods include:\n- register(cfg): Registers the plugin with the given configuration.\n- load_datasets(cfg): Loads and preprocesses the dataset for training.\n- pre_model_load(cfg): Performs actions before the model is loaded.\n- post_model_build(cfg, model): Performs actions after the model is loaded, but\nbefore LoRA adapters are applied.\n- pre_lora_load(cfg, model): Performs actions before LoRA weights are loaded.\n- post_lora_load(cfg, model): Performs actions after LoRA weights are loaded.\n- post_model_load(cfg, model): Performs actions after the model is loaded,\ninclusive of any adapters.\n- post_trainer_create(cfg, trainer): Performs actions after the trainer is\ncreated.\n- create_optimizer(cfg, trainer): Creates and returns an optimizer for training.\n- create_lr_scheduler(cfg, trainer, optimizer, num_training_steps): Creates and\nreturns a learning rate scheduler.\n- add_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer before\ntraining.\n- add_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after\ntraining.\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nAdds callbacks to the trainer after creating the trainer. 
This is useful for\n\n\nadd_callbacks_pre_trainer\nSet up callbacks before creating the trainer.\n\n\ncreate_lr_scheduler\nCreates and returns a learning rate scheduler.\n\n\ncreate_optimizer\nCreates and returns an optimizer for training.\n\n\nget_collator_cls_and_kwargs\nReturns a custom class for the collator.\n\n\nget_input_args\nReturns a pydantic model for the plugin’s input arguments.\n\n\nget_trainer_cls\nReturns a custom class for the trainer.\n\n\nget_training_args\nReturns custom training arguments to set on TrainingArgs.\n\n\nget_training_args_mixin\nReturns a dataclass model for the plugin’s training arguments.\n\n\nload_datasets\nLoads and preprocesses the dataset for training.\n\n\npost_lora_load\nPerforms actions after LoRA weights are loaded.\n\n\npost_model_build\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\npost_model_load\nPerforms actions after the model is loaded.\n\n\npost_train\nPerforms actions after training is complete.\n\n\npost_train_unload\nPerforms actions after training is complete and the model is unloaded.\n\n\npost_trainer_create\nPerforms actions after the trainer is created.\n\n\npre_lora_load\nPerforms actions before LoRA weights are loaded.\n\n\npre_model_load\nPerforms actions before the model is loaded.\n\n\nregister\nRegisters the plugin with the given configuration as an unparsed dict.\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_post_trainer(cfg, trainer)\nAdds callbacks to the trainer after creating the trainer. 
This is useful for\ncallbacks that require access to the model or trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_pre_trainer(cfg, model)\nSet up callbacks before creating the trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.create_lr_scheduler(\n cfg,\n trainer,\n optimizer,\n num_training_steps,\n)\nCreates and returns a learning rate scheduler.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\noptimizer\nOptimizer\nThe optimizer for training.\nrequired\n\n\nnum_training_steps\nint\nTotal number of training steps\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nLRScheduler | None\nThe created learning rate scheduler.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.create_optimizer(cfg, trainer)\nCreates and returns an optimizer for training.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nOptimizer | None\nThe created optimizer.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_collator_cls_and_kwargs(cfg, is_eval=False)\nReturns a custom class for the 
collator.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\nis_eval\nbool\nWhether this is an eval split.\nFalse\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nclass\n\nThe class for the collator.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_input_args()\nReturns a pydantic model for the plugin’s input arguments.\n\n\n\nintegrations.base.BasePlugin.get_trainer_cls(cfg)\nReturns a custom class for the trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntype[Trainer] | None\nThe first non-None trainer class returned by a plugin.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_training_args(cfg)\nReturns custom training arguments to set on TrainingArgs.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nobject\n\ndict containing the training arguments.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_training_args_mixin()\nReturns a dataclass model for the plugin’s training arguments.\n\n\n\nintegrations.base.BasePlugin.load_datasets(cfg, preprocess=False)\nLoads and preprocesses the dataset for training.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\npreprocess\nbool\nWhether this is the preprocess step of the datasets.\nFalse\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\ndataset_meta\nUnion['TrainDatasetMeta', None]\nThe metadata for the training dataset.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_lora_load(cfg, model)\nPerforms actions after LoRA weights are loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the 
plugin.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_build(cfg, model)\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_load(cfg, model)\nPerforms actions after the model is loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_train(cfg, model)\nPerforms actions after training is complete.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe axolotl configuration.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_train_unload(cfg)\nPerforms actions after training is complete and the model is unloaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_trainer_create(cfg, trainer)\nPerforms actions after the trainer is created.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.pre_lora_load(cfg, model)\nPerforms actions before LoRA weights are loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.pre_model_load(cfg)\nPerforms actions before the model is 
loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.register(cfg)\nRegisters the plugin with the given configuration as an unparsed dict.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\n\n\nintegrations.base.PluginManager()\nThe PluginManager class is responsible for loading and managing plugins. It\nshould be a singleton so it can be accessed from anywhere in the codebase.\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nplugins\nOrderedDict[str, BasePlugin]\nA list of loaded plugins.\n\n\n\n\n\n\nKey methods include:\n- get_instance(): Static method to get the singleton instance of PluginManager.\n- register(plugin_name: str): Registers a new plugin by its name.\n- pre_model_load(cfg): Calls the pre_model_load method of all registered plugins.\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\nadd_callbacks_pre_trainer\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\ncreate_lr_scheduler\nCalls the create_lr_scheduler method of all registered plugins and returns\n\n\ncreate_optimizer\nCalls the create_optimizer method of all registered plugins and returns\n\n\nget_collator_cls_and_kwargs\nCalls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.\n\n\nget_input_args\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\n\n\nget_instance\nReturns the singleton instance of PluginManager. 
If the instance doesn’t\n\n\nget_trainer_cls\nCalls the get_trainer_cls method of all registered plugins and returns the\n\n\nget_training_args\nCalls the get_training_args method of all registered plugins and returns the combined training arguments.\n\n\nget_training_args_mixin\nReturns a list of dataclasses for all registered plugins’ training args mixins’\n\n\nload_datasets\nCalls the load_datasets method of each registered plugin.\n\n\npost_lora_load\nCalls the post_lora_load method of all registered plugins.\n\n\npost_model_build\nCalls the post_model_build method of all registered plugins after the\n\n\npost_model_load\nCalls the post_model_load method of all registered plugins after the model\n\n\npost_train\nCalls the post_train method of all registered plugins.\n\n\npost_train_unload\nCalls the post_train_unload method of all registered plugins.\n\n\npost_trainer_create\nCalls the post_trainer_create method of all registered plugins.\n\n\npre_lora_load\nCalls the pre_lora_load method of all registered plugins.\n\n\npre_model_load\nCalls the pre_model_load method of all registered plugins.\n\n\nregister\nRegisters a new plugin by its name.\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded 
model.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.create_lr_scheduler(\n trainer,\n optimizer,\n num_training_steps,\n)\nCalls the create_lr_scheduler method of all registered plugins and returns\nthe first non-None scheduler.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\noptimizer\nOptimizer\nThe optimizer for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nLRScheduler | None\nThe created learning rate scheduler, or None if not found.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.create_optimizer(trainer)\nCalls the create_optimizer method of all registered plugins and returns\nthe first non-None optimizer.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nOptimizer | None\nThe created optimizer, or None if none was found.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_collator_cls_and_kwargs(cfg, is_eval=False)\nCalls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.\nParameters:\ncfg (dict): The configuration for the plugins.\nis_eval (bool): Whether this is an eval split.\nReturns:\nobject: The collator class, or None if none was found.\n\n\n\nintegrations.base.PluginManager.get_input_args()\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[str]\nA list of Pydantic classes for all registered plugins’ input arguments.’\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_instance()\nReturns the singleton instance of PluginManager. 
If the instance doesn’t\nexist, it creates a new one.\n\n\n\nintegrations.base.PluginManager.get_trainer_cls(cfg)\nCalls the get_trainer_cls method of all registered plugins and returns the\nfirst non-None trainer class.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nTrainer | None\nThe first non-None trainer class returned by a plugin.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_training_args(cfg)\nCalls the get_training_args method of all registered plugins and returns the combined training arguments.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nobject: The training arguments\n\n\n\nintegrations.base.PluginManager.get_training_args_mixin()\nReturns a list of dataclasses for all registered plugins’ training args mixins’\nReturns:\nlist[str]: A list of dataclsses\n\n\n\nintegrations.base.PluginManager.load_datasets(cfg, preprocess=False)\nCalls the load_datasets method of each registered plugin.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\npreprocess\nbool\nWhether this is preprocess step of the datasets.\nFalse\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nUnion['TrainDatasetMeta', None]\nThe dataset metadata loaded from all registered plugins.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_lora_load(cfg, model)\nCalls the post_lora_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_build(cfg, model)\nCalls the post_model_build method of all registered plugins after the\nmodel has been built / loaded, but before any adapters have been 
applied.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_load(cfg, model)\nCalls the post_model_load method of all registered plugins after the model\nhas been loaded inclusive of any adapters.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_train(cfg, model)\nCalls the post_train method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_train_unload(cfg)\nCalls the post_train_unload method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_trainer_create(cfg, trainer)\nCalls the post_trainer_create method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.pre_lora_load(cfg, model)\nCalls the pre_lora_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.pre_model_load(cfg)\nCalls the pre_model_load method of all registered 
plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.register(plugin_name)\nRegisters a new plugin by its name.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nplugin_name\nstr\nThe name of the plugin to be registered.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nImportError\nIf the plugin module cannot be imported." + "text": "Name\nDescription\n\n\n\n\nBaseOptimizerFactory\nBase class for factories to create custom optimizers\n\n\nBasePlugin\nBase class for all plugins. Defines the interface for plugin methods.\n\n\nPluginManager\nThe PluginManager class is responsible for loading and managing plugins. It\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory()\nBase class for factories to create custom optimizers\n\n\n\n\n\nName\nDescription\n\n\n\n\nget_decay_parameter_names\nGet all parameter names that weight decay will be applied to.\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory.get_decay_parameter_names(model)\nGet all parameter names that weight decay will be applied to.\nThis function filters out parameters in two ways:\n1. By layer type (instances of layers specified in ALL_LAYERNORM_LAYERS)\n2. By parameter name patterns (containing ‘bias’, or variation of ‘norm’)\n\n\n\n\n\nintegrations.base.BasePlugin()\nBase class for all plugins. Defines the interface for plugin methods.\nA plugin is a reusable, modular, and self-contained piece of code that extends\nthe functionality of Axolotl. 
Plugins can be used to integrate third-party models,\nmodify the training process, or add new features.\nTo create a new plugin, you need to inherit from the BasePlugin class and\nimplement the required methods.\n\n\nPlugin methods include:\n- register(cfg): Registers the plugin with the given configuration.\n- load_datasets(cfg): Loads and preprocesses the dataset for training.\n- pre_model_load(cfg): Performs actions before the model is loaded.\n- post_model_build(cfg, model): Performs actions after the model is loaded, but\nbefore LoRA adapters are applied.\n- pre_lora_load(cfg, model): Performs actions before LoRA weights are loaded.\n- post_lora_load(cfg, model): Performs actions after LoRA weights are loaded.\n- post_model_load(cfg, model): Performs actions after the model is loaded,\ninclusive of any adapters.\n- post_trainer_create(cfg, trainer): Performs actions after the trainer is\ncreated.\n- create_optimizer(cfg, trainer): Creates and returns an optimizer for training.\n- create_lr_scheduler(cfg, trainer, optimizer, num_training_steps): Creates and\nreturns a learning rate scheduler.\n- add_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer before\ntraining.\n- add_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after\ntraining.\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nAdds callbacks to the trainer after creating the trainer. 
This is useful for\n\n\nadd_callbacks_pre_trainer\nSet up callbacks before creating the trainer.\n\n\ncreate_lr_scheduler\nCreates and returns a learning rate scheduler.\n\n\ncreate_optimizer\nCreates and returns an optimizer for training.\n\n\nget_collator_cls_and_kwargs\nReturns a custom class for the collator.\n\n\nget_input_args\nReturns a pydantic model for the plugin’s input arguments.\n\n\nget_trainer_cls\nReturns a custom class for the trainer.\n\n\nget_training_args\nReturns custom training arguments to set on TrainingArgs.\n\n\nget_training_args_mixin\nReturns a dataclass model for the plugin’s training arguments.\n\n\nload_datasets\nLoads and preprocesses the dataset for training.\n\n\non_rollouts_scored\nCalled after rollouts are scored during online RL (GRPO/PPO).\n\n\npost_lora_load\nPerforms actions after LoRA weights are loaded.\n\n\npost_model_build\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\npost_model_load\nPerforms actions after the model is loaded.\n\n\npost_train\nPerforms actions after training is complete.\n\n\npost_train_unload\nPerforms actions after training is complete and the model is unloaded.\n\n\npost_trainer_create\nPerforms actions after the trainer is created.\n\n\npre_lora_load\nPerforms actions before LoRA weights are loaded.\n\n\npre_model_load\nPerforms actions before the model is loaded.\n\n\nregister\nRegisters the plugin with the given configuration as an unparsed dict.\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_post_trainer(cfg, trainer)\nAdds callbacks to the trainer after creating the trainer. 
This is useful for\ncallbacks that require access to the model or trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_pre_trainer(cfg, model)\nSet up callbacks before creating the trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.create_lr_scheduler(\n cfg,\n trainer,\n optimizer,\n num_training_steps,\n)\nCreates and returns a learning rate scheduler.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\noptimizer\nOptimizer\nThe optimizer for training.\nrequired\n\n\nnum_training_steps\nint\nTotal number of training steps\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nLRScheduler | None\nThe created learning rate scheduler.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.create_optimizer(cfg, trainer)\nCreates and returns an optimizer for training.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nOptimizer | None\nThe created optimizer.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_collator_cls_and_kwargs(cfg, is_eval=False)\nReturns a custom class for the 
collator.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\nis_eval\nbool\nWhether this is an eval split.\nFalse\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nclass\n\nThe class for the collator.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_input_args()\nReturns a pydantic model for the plugin’s input arguments.\n\n\n\nintegrations.base.BasePlugin.get_trainer_cls(cfg)\nReturns a custom class for the trainer.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\ntype[Trainer] | None\nThe first non-None trainer class returned by a plugin.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_training_args(cfg)\nReturns custom training arguments to set on TrainingArgs.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe global axolotl configuration.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nobject\n\ndict containing the training arguments.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.get_training_args_mixin()\nReturns a dataclass model for the plugin’s training arguments.\n\n\n\nintegrations.base.BasePlugin.load_datasets(cfg, preprocess=False)\nLoads and preprocesses the dataset for training.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\npreprocess\nbool\nWhether this is the preprocess step of the datasets.\nFalse\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\ndataset_meta\nUnion['TrainDatasetMeta', None]\nThe metadata for the training dataset.\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.on_rollouts_scored(\n cfg,\n trainer,\n prompts,\n completions,\n rewards,\n advantages,\n)\nCalled after rollouts are scored during online RL (GRPO/PPO).\nProvides access to the full scored rollout data for 
logging, trace\nstorage, or analysis. Called once per scoring step with all samples\nfrom that step.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe axolotl configuration.\nrequired\n\n\ntrainer\n\nThe trainer instance.\nrequired\n\n\nprompts\nlist[str]\nList of prompt texts (one per sample).\nrequired\n\n\ncompletions\nlist[str]\nList of completion texts (one per sample).\nrequired\n\n\nrewards\ndict[str, list[float]]\nDict mapping reward function name to list of reward values.\nrequired\n\n\nadvantages\nlist[float]\nList of advantage values (one per sample).\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_lora_load(cfg, model)\nPerforms actions after LoRA weights are loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_build(cfg, model)\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_load(cfg, model)\nPerforms actions after the model is loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_train(cfg, model)\nPerforms actions after training is complete.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe axolotl configuration.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_train_unload(cfg)\nPerforms actions after training is complete and the model is 
unloaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_trainer_create(cfg, trainer)\nPerforms actions after the trainer is created.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.pre_lora_load(cfg, model)\nPerforms actions before LoRA weights are loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.pre_model_load(cfg)\nPerforms actions before the model is loaded.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.register(cfg)\nRegisters the plugin with the given configuration as an unparsed dict.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\n\n\nintegrations.base.PluginManager()\nThe PluginManager class is responsible for loading and managing plugins. 
It\nshould be a singleton so it can be accessed from anywhere in the codebase.\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\nplugins\nOrderedDict[str, BasePlugin]\nA list of loaded plugins.\n\n\n\n\n\n\nKey methods include:\n- get_instance(): Static method to get the singleton instance of PluginManager.\n- register(plugin_name: str): Registers a new plugin by its name.\n- pre_model_load(cfg): Calls the pre_model_load method of all registered plugins.\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\nadd_callbacks_pre_trainer\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\ncreate_lr_scheduler\nCalls the create_lr_scheduler method of all registered plugins and returns\n\n\ncreate_optimizer\nCalls the create_optimizer method of all registered plugins and returns\n\n\nget_collator_cls_and_kwargs\nCalls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.\n\n\nget_input_args\nReturns a list of Pydantic classes for all registered plugins’ input arguments.\n\n\nget_instance\nReturns the singleton instance of PluginManager. 
If the instance doesn’t\n\n\nget_trainer_cls\nCalls the get_trainer_cls method of all registered plugins and returns the\n\n\nget_training_args\nCalls the get_training_args method of all registered plugins and returns the combined training arguments.\n\n\nget_training_args_mixin\nReturns a list of dataclasses for all registered plugins’ training args mixins.\n\n\nload_datasets\nCalls the load_datasets method of each registered plugin.\n\n\non_rollouts_scored\nCalls the on_rollouts_scored method of all registered plugins.\n\n\npost_lora_load\nCalls the post_lora_load method of all registered plugins.\n\n\npost_model_build\nCalls the post_model_build method of all registered plugins after the\n\n\npost_model_load\nCalls the post_model_load method of all registered plugins after the model\n\n\npost_train\nCalls the post_train method of all registered plugins.\n\n\npost_train_unload\nCalls the post_train_unload method of all registered plugins.\n\n\npost_trainer_create\nCalls the post_trainer_create method of all registered plugins.\n\n\npre_lora_load\nCalls the pre_lora_load method of all registered plugins.\n\n\npre_model_load\nCalls the pre_model_load method of all registered plugins.\n\n\nregister\nRegisters a new plugin by its name.\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe 
configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[Callable]\nA list of callback functions to be added to the TrainingArgs.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.create_lr_scheduler(\n    trainer,\n    optimizer,\n    num_training_steps,\n)\nCalls the create_lr_scheduler method of all registered plugins and returns\nthe first non-None scheduler.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\noptimizer\nOptimizer\nThe optimizer for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nLRScheduler | None\nThe created learning rate scheduler, or None if not found.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.create_optimizer(trainer)\nCalls the create_optimizer method of all registered plugins and returns\nthe first non-None optimizer.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nOptimizer | None\nThe created optimizer, or None if none was found.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_collator_cls_and_kwargs(cfg, is_eval=False)\nCalls the get_collator_cls_and_kwargs method of all registered plugins and returns the first non-None collator class.\nParameters:\ncfg (dict): The configuration for the plugins.\nis_eval (bool): Whether this is an eval split.\nReturns:\nobject: The collator class, or None if none was found.\n\n\n\nintegrations.base.PluginManager.get_input_args()\nReturns a list of Pydantic classes for all registered plugins’ input arguments.\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nlist[str]\nA list of Pydantic classes for all registered plugins’ input arguments.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_instance()\nReturns the singleton instance of PluginManager. 
If the instance doesn’t\nexist, it creates a new one.\n\n\n\nintegrations.base.PluginManager.get_trainer_cls(cfg)\nCalls the get_trainer_cls method of all registered plugins and returns the\nfirst non-None trainer class.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nTrainer | None\nThe first non-None trainer class returned by a plugin.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.get_training_args(cfg)\nCalls the get_training_args method of all registered plugins and returns the combined training arguments.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nobject: The training arguments.\n\n\n\nintegrations.base.PluginManager.get_training_args_mixin()\nReturns a list of dataclasses for all registered plugins’ training args mixins.\nReturns:\nlist[str]: A list of dataclasses\n\n\n\nintegrations.base.PluginManager.load_datasets(cfg, preprocess=False)\nCalls the load_datasets method of each registered plugin.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\npreprocess\nbool\nWhether this is the preprocess step of the datasets.\nFalse\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nUnion['TrainDatasetMeta', None]\nThe dataset metadata loaded from all registered plugins.\n\n\n\n\n\n\n\nintegrations.base.PluginManager.on_rollouts_scored(\n    cfg,\n    trainer,\n    prompts,\n    completions,\n    rewards,\n    advantages,\n)\nCalls the on_rollouts_scored method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\n\nThe trainer instance.\nrequired\n\n\nprompts\nlist[str]\nList of prompt texts.\nrequired\n\n\ncompletions\nlist[str]\nList of completion texts.\nrequired\n\n\nrewards\ndict[str, list[float]]\nDict 
mapping reward function name to list of rewards.\nrequired\n\n\nadvantages\nlist[float]\nList of advantage values.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_lora_load(cfg, model)\nCalls the post_lora_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_build(cfg, model)\nCalls the post_model_build method of all registered plugins after the\nmodel has been built / loaded, but before any adapters have been applied.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_load(cfg, model)\nCalls the post_model_load method of all registered plugins after the model\nhas been loaded inclusive of any adapters.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_train(cfg, model)\nCalls the post_train method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel | PeftModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_train_unload(cfg)\nCalls the post_train_unload method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_trainer_create(cfg, trainer)\nCalls the post_trainer_create method of all registered 
plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\ntrainer\nTrainer\nThe trainer object for training.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.pre_lora_load(cfg, model)\nCalls the pre_lora_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\nmodel\nPreTrainedModel\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.pre_model_load(cfg)\nCalls the pre_model_load method of all registered plugins.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\nDictDefault\nThe configuration for the plugins.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.register(plugin_name)\nRegisters a new plugin by its name.\n\n\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\nplugin_name\nstr\nThe name of the plugin to be registered.\nrequired\n\n\n\n\n\n\n\n\n\nName\nType\nDescription\n\n\n\n\n\nImportError\nIf the plugin module cannot be imported." 
}, { "objectID": "docs/api/integrations.base.html#functions", diff --git a/sitemap.xml b/sitemap.xml index a715e1f3f..b77e704a6 100644 --- a/sitemap.xml +++ b/sitemap.xml @@ -2,942 +2,942 @@ https://docs.axolotl.ai/examples/colab-notebooks/colab-axolotl-example.html - 2026-03-25T15:16:53.882Z + 2026-03-25T15:20:49.130Z https://docs.axolotl.ai/src/axolotl/integrations/cut_cross_entropy/ACKNOWLEDGEMENTS.html - 2026-03-25T15:16:53.911Z + 2026-03-25T15:20:49.155Z https://docs.axolotl.ai/docs/inference.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/expert_quantization.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/installation.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/models/ministral3/think.html - 2026-03-25T15:20:46.509Z + 2026-03-25T15:24:50.288Z https://docs.axolotl.ai/docs/models/granite4.html - 2026-03-25T15:20:46.518Z + 2026-03-25T15:24:50.298Z https://docs.axolotl.ai/docs/models/seed-oss.html - 2026-03-25T15:20:46.517Z + 2026-03-25T15:24:50.296Z https://docs.axolotl.ai/docs/models/orpheus.html - 2026-03-25T15:20:46.519Z + 2026-03-25T15:24:50.299Z https://docs.axolotl.ai/docs/models/internvl3_5.html - 2026-03-25T15:20:46.507Z + 2026-03-25T15:24:50.286Z https://docs.axolotl.ai/docs/models/magistral/vision.html - 2026-03-25T15:20:46.511Z + 2026-03-25T15:24:50.291Z https://docs.axolotl.ai/docs/models/mimo.html - 2026-03-25T15:20:46.506Z + 2026-03-25T15:24:50.285Z https://docs.axolotl.ai/docs/models/gpt-oss.html - 2026-03-25T15:20:46.516Z + 2026-03-25T15:24:50.296Z https://docs.axolotl.ai/docs/models/qwen3-next.html - 2026-03-25T15:20:46.515Z + 2026-03-25T15:24:50.294Z https://docs.axolotl.ai/docs/models/llama-2.html - 2026-03-25T15:20:46.514Z + 2026-03-25T15:24:50.294Z https://docs.axolotl.ai/docs/models/kimi-linear.html - 2026-03-25T15:20:46.505Z + 2026-03-25T15:24:50.285Z https://docs.axolotl.ai/docs/models/smolvlm2.html - 
2026-03-25T15:20:46.518Z + 2026-03-25T15:24:50.297Z https://docs.axolotl.ai/docs/models/olmo3.html - 2026-03-25T15:20:46.507Z + 2026-03-25T15:24:50.286Z https://docs.axolotl.ai/docs/models/jamba.html - 2026-03-25T15:20:46.519Z + 2026-03-25T15:24:50.299Z https://docs.axolotl.ai/docs/models/mistral-small.html - 2026-03-25T15:20:46.512Z + 2026-03-25T15:24:50.292Z https://docs.axolotl.ai/docs/models/devstral.html - 2026-03-25T15:20:46.513Z + 2026-03-25T15:24:50.293Z https://docs.axolotl.ai/docs/models/index.html - 2026-03-25T15:20:46.520Z + 2026-03-25T15:24:50.299Z https://docs.axolotl.ai/docs/lora_optims.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/cli.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.121Z https://docs.axolotl.ai/docs/gradient_checkpointing.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/dataset_preprocessing.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/docker.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/attention.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.121Z https://docs.axolotl.ai/docs/api/prompt_strategies.input_output.html - 2026-03-25T15:20:20.851Z + 2026-03-25T15:24:25.214Z https://docs.axolotl.ai/docs/api/loaders.adapter.html - 2026-03-25T15:20:20.655Z + 2026-03-25T15:24:25.024Z https://docs.axolotl.ai/docs/api/monkeypatch.btlm_attn_hijack_flash.html - 2026-03-25T15:20:21.135Z + 2026-03-25T15:24:25.490Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.chatml.html - 2026-03-25T15:20:20.919Z + 2026-03-25T15:24:25.281Z https://docs.axolotl.ai/docs/api/cli.inference.html - 2026-03-25T15:20:20.413Z + 2026-03-25T15:24:24.783Z https://docs.axolotl.ai/docs/api/core.trainers.mixins.optimizer.html - 2026-03-25T15:20:20.686Z + 2026-03-25T15:24:25.054Z https://docs.axolotl.ai/docs/api/kernels.swiglu.html - 2026-03-25T15:20:21.053Z + 2026-03-25T15:24:25.413Z 
https://docs.axolotl.ai/docs/api/utils.optimizers.adopt.html - 2026-03-25T15:20:21.353Z + 2026-03-25T15:24:25.703Z https://docs.axolotl.ai/docs/api/utils.callbacks.comet_.html - 2026-03-25T15:20:21.853Z + 2026-03-25T15:24:26.204Z https://docs.axolotl.ai/docs/api/utils.schemas.utils.html - 2026-03-25T15:20:21.512Z + 2026-03-25T15:24:25.858Z https://docs.axolotl.ai/docs/api/monkeypatch.data.batch_dataset_fetcher.html - 2026-03-25T15:20:21.169Z + 2026-03-25T15:24:25.523Z https://docs.axolotl.ai/docs/api/utils.model_shard_quant.html - 2026-03-25T15:20:21.234Z + 2026-03-25T15:24:25.585Z https://docs.axolotl.ai/docs/api/cli.merge_sharded_fsdp_weights.html - 2026-03-25T15:20:20.438Z + 2026-03-25T15:24:24.807Z https://docs.axolotl.ai/docs/api/cli.delinearize_llama4.html - 2026-03-25T15:20:20.395Z + 2026-03-25T15:24:24.765Z https://docs.axolotl.ai/docs/api/integrations.cut_cross_entropy.args.html - 2026-03-25T15:20:21.690Z + 2026-03-25T15:24:26.048Z https://docs.axolotl.ai/docs/api/utils.dict.html - 2026-03-25T15:20:21.343Z + 2026-03-25T15:24:25.693Z https://docs.axolotl.ai/docs/api/monkeypatch.multipack.html - 2026-03-25T15:20:21.084Z + 2026-03-25T15:24:25.442Z https://docs.axolotl.ai/docs/api/utils.schemas.config.html - 2026-03-25T15:20:21.409Z + 2026-03-25T15:24:25.756Z https://docs.axolotl.ai/docs/api/cli.cloud.base.html - 2026-03-25T15:20:20.468Z + 2026-03-25T15:24:24.845Z https://docs.axolotl.ai/docs/api/utils.collators.batching.html - 2026-03-25T15:20:21.766Z + 2026-03-25T15:24:26.120Z https://docs.axolotl.ai/docs/api/prompt_strategies.stepwise_supervised.html - 2026-03-25T15:20:20.857Z + 2026-03-25T15:24:25.220Z https://docs.axolotl.ai/docs/api/integrations.spectrum.args.html - 2026-03-25T15:20:21.715Z + 2026-03-25T15:24:26.072Z https://docs.axolotl.ai/docs/api/cli.config.html - 2026-03-25T15:20:20.388Z + 2026-03-25T15:24:24.759Z https://docs.axolotl.ai/docs/api/evaluate.html - 2026-03-25T15:20:20.082Z + 2026-03-25T15:24:24.461Z 
https://docs.axolotl.ai/docs/api/utils.schemas.training.html - 2026-03-25T15:20:21.427Z + 2026-03-25T15:24:25.773Z https://docs.axolotl.ai/docs/api/core.trainers.base.html - 2026-03-25T15:20:20.549Z + 2026-03-25T15:24:24.924Z https://docs.axolotl.ai/docs/api/cli.utils.args.html - 2026-03-25T15:20:20.494Z + 2026-03-25T15:24:24.870Z https://docs.axolotl.ai/docs/api/core.chat.format.chatml.html - 2026-03-25T15:20:20.242Z + 2026-03-25T15:24:24.617Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.passthrough.html - 2026-03-25T15:20:20.925Z + 2026-03-25T15:24:25.287Z https://docs.axolotl.ai/docs/api/prompt_strategies.messages.chat.html - 2026-03-25T15:20:20.884Z + 2026-03-25T15:24:25.247Z https://docs.axolotl.ai/docs/api/monkeypatch.relora.html - 2026-03-25T15:20:21.089Z + 2026-03-25T15:24:25.446Z https://docs.axolotl.ai/docs/api/utils.callbacks.qat.html - 2026-03-25T15:20:21.862Z + 2026-03-25T15:24:26.213Z https://docs.axolotl.ai/docs/api/cli.art.html - 2026-03-25T15:20:20.357Z + 2026-03-25T15:24:24.729Z https://docs.axolotl.ai/docs/api/integrations.grokfast.optimizer.html - 2026-03-25T15:20:21.692Z + 2026-03-25T15:24:26.050Z https://docs.axolotl.ai/docs/api/datasets.html - 2026-03-25T15:20:20.089Z + 2026-03-25T15:24:24.469Z https://docs.axolotl.ai/docs/api/prompt_strategies.kto.llama3.html - 2026-03-25T15:20:20.936Z + 2026-03-25T15:24:25.297Z https://docs.axolotl.ai/docs/api/prompt_strategies.kto.chatml.html - 2026-03-25T15:20:20.946Z + 2026-03-25T15:24:25.308Z https://docs.axolotl.ai/docs/api/utils.ctx_managers.sequence_parallel.html - 2026-03-25T15:20:20.729Z + 2026-03-25T15:24:25.096Z https://docs.axolotl.ai/docs/api/core.trainers.grpo.trainer.html - 2026-03-25T15:20:20.602Z + 2026-03-25T15:24:24.976Z https://docs.axolotl.ai/docs/api/utils.samplers.multipack.html - 2026-03-25T15:20:21.828Z + 2026-03-25T15:24:26.181Z https://docs.axolotl.ai/docs/api/core.trainers.dpo.trainer.html - 2026-03-25T15:20:20.584Z + 2026-03-25T15:24:24.958Z 
https://docs.axolotl.ai/docs/api/monkeypatch.mixtral.html - 2026-03-25T15:20:21.171Z + 2026-03-25T15:24:25.525Z https://docs.axolotl.ai/docs/api/utils.schemas.multimodal.html - 2026-03-25T15:20:21.474Z + 2026-03-25T15:24:25.820Z https://docs.axolotl.ai/docs/api/utils.lora.html - 2026-03-25T15:20:21.227Z + 2026-03-25T15:24:25.578Z https://docs.axolotl.ai/docs/api/core.chat.format.llama3x.html - 2026-03-25T15:20:20.243Z + 2026-03-25T15:24:24.619Z https://docs.axolotl.ai/docs/api/cli.train.html - 2026-03-25T15:20:20.316Z + 2026-03-25T15:24:24.689Z https://docs.axolotl.ai/docs/api/utils.trainer.html - 2026-03-25T15:20:21.270Z + 2026-03-25T15:24:25.622Z https://docs.axolotl.ai/docs/api/monkeypatch.mistral_attn_hijack_flash.html - 2026-03-25T15:20:21.082Z + 2026-03-25T15:24:25.440Z https://docs.axolotl.ai/docs/api/core.builders.base.html - 2026-03-25T15:20:20.181Z + 2026-03-25T15:24:24.558Z https://docs.axolotl.ai/docs/api/cli.utils.fetch.html - 2026-03-25T15:20:20.501Z + 2026-03-25T15:24:24.877Z https://docs.axolotl.ai/docs/api/utils.tokenization.html - 2026-03-25T15:20:21.218Z + 2026-03-25T15:24:25.570Z https://docs.axolotl.ai/docs/api/core.trainers.trl.html - 2026-03-25T15:20:20.568Z + 2026-03-25T15:24:24.942Z https://docs.axolotl.ai/docs/api/cli.checks.html - 2026-03-25T15:20:20.366Z + 2026-03-25T15:24:24.737Z https://docs.axolotl.ai/docs/api/prompt_strategies.kto.user_defined.html - 2026-03-25T15:20:20.948Z + 2026-03-25T15:24:25.310Z https://docs.axolotl.ai/docs/api/monkeypatch.llama_attn_hijack_flash.html - 2026-03-25T15:20:21.078Z + 2026-03-25T15:24:25.436Z https://docs.axolotl.ai/docs/api/prompt_strategies.orcamini.html - 2026-03-25T15:20:20.870Z + 2026-03-25T15:24:25.233Z https://docs.axolotl.ai/docs/api/monkeypatch.transformers_fa_utils.html - 2026-03-25T15:20:21.155Z + 2026-03-25T15:24:25.509Z https://docs.axolotl.ai/docs/api/kernels.lora.html - 2026-03-25T15:20:21.027Z + 2026-03-25T15:24:25.387Z https://docs.axolotl.ai/docs/api/utils.callbacks.profiler.html - 
2026-03-25T15:20:21.841Z + 2026-03-25T15:24:26.193Z https://docs.axolotl.ai/docs/api/utils.callbacks.mlflow_.html - 2026-03-25T15:20:21.848Z + 2026-03-25T15:24:26.200Z https://docs.axolotl.ai/docs/api/utils.freeze.html - 2026-03-25T15:20:21.248Z + 2026-03-25T15:24:25.600Z https://docs.axolotl.ai/docs/api/integrations.kd.trainer.html - 2026-03-25T15:20:21.702Z + 2026-03-25T15:24:26.059Z https://docs.axolotl.ai/docs/api/monkeypatch.gradient_checkpointing.offload_disk.html - 2026-03-25T15:20:21.209Z + 2026-03-25T15:24:25.562Z https://docs.axolotl.ai/docs/api/utils.data.streaming.html - 2026-03-25T15:20:21.354Z + 2026-03-25T15:24:25.705Z https://docs.axolotl.ai/docs/api/prompt_tokenizers.html - 2026-03-25T15:20:20.160Z + 2026-03-25T15:24:24.538Z https://docs.axolotl.ai/docs/api/core.trainers.mixins.rng_state_loader.html - 2026-03-25T15:20:20.691Z + 2026-03-25T15:24:25.058Z https://docs.axolotl.ai/docs/api/cli.cloud.modal_.html - 2026-03-25T15:20:20.477Z + 2026-03-25T15:24:24.854Z https://docs.axolotl.ai/docs/api/core.trainers.mixins.scheduler.html - 2026-03-25T15:20:20.699Z + 2026-03-25T15:24:25.067Z https://docs.axolotl.ai/docs/api/convert.html - 2026-03-25T15:20:20.106Z + 2026-03-25T15:24:24.486Z https://docs.axolotl.ai/docs/api/models.mamba.modeling_mamba.html - 2026-03-25T15:20:21.740Z + 2026-03-25T15:24:26.096Z https://docs.axolotl.ai/docs/api/cli.args.html - 2026-03-25T15:20:20.353Z + 2026-03-25T15:24:24.724Z https://docs.axolotl.ai/docs/api/core.chat.format.shared.html - 2026-03-25T15:20:20.245Z + 2026-03-25T15:24:24.620Z https://docs.axolotl.ai/docs/api/prompt_strategies.bradley_terry.llama3.html - 2026-03-25T15:20:20.979Z + 2026-03-25T15:24:25.340Z https://docs.axolotl.ai/docs/api/index.html - 2026-03-25T15:20:19.987Z + 2026-03-25T15:24:24.370Z https://docs.axolotl.ai/docs/fsdp_qlora.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/dataset-formats/stepwise_supervised.html - 2026-03-25T15:16:53.872Z + 
2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/dataset-formats/template_free.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/dataset-formats/index.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/telemetry.html - 2026-03-25T15:16:53.877Z + 2026-03-25T15:20:49.126Z https://docs.axolotl.ai/docs/config-reference.html - 2026-03-25T15:20:45.412Z + 2026-03-25T15:24:49.204Z https://docs.axolotl.ai/docs/ray-integration.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.125Z https://docs.axolotl.ai/docs/streaming.html - 2026-03-25T15:16:53.877Z + 2026-03-25T15:20:49.126Z https://docs.axolotl.ai/docs/sequence_parallelism.html - 2026-03-25T15:16:53.877Z + 2026-03-25T15:20:49.126Z https://docs.axolotl.ai/docs/unsloth.html - 2026-03-25T15:16:53.877Z + 2026-03-25T15:20:49.126Z https://docs.axolotl.ai/docs/mixed_precision.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/amd_hpc.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.121Z https://docs.axolotl.ai/docs/lr_groups.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/optimizations.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/mac.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/index.html - 2026-03-25T15:16:53.903Z + 2026-03-25T15:20:49.147Z https://docs.axolotl.ai/docs/optimizers.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/getting-started.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/multi-node.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/input_output.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/nd_parallelism.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/dataset_loading.html 
- 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/quantize.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.125Z https://docs.axolotl.ai/docs/rlhf.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.125Z https://docs.axolotl.ai/docs/custom_integrations.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.121Z https://docs.axolotl.ai/docs/qat.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/checkpoint_saving.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.121Z https://docs.axolotl.ai/docs/dataset-formats/conversation.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.121Z https://docs.axolotl.ai/docs/dataset-formats/inst_tune.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/dataset-formats/tokenized.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/dataset-formats/pretraining.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/api/cli.main.html - 2026-03-25T15:20:20.305Z + 2026-03-25T15:24:24.679Z https://docs.axolotl.ai/docs/api/utils.schemas.trl.html - 2026-03-25T15:20:21.467Z + 2026-03-25T15:24:25.813Z https://docs.axolotl.ai/docs/api/core.datasets.transforms.chat_builder.html - 2026-03-25T15:20:20.262Z + 2026-03-25T15:24:24.637Z https://docs.axolotl.ai/docs/api/common.const.html - 2026-03-25T15:20:21.719Z + 2026-03-25T15:24:26.076Z https://docs.axolotl.ai/docs/api/cli.utils.load.html - 2026-03-25T15:20:20.508Z + 2026-03-25T15:24:24.884Z https://docs.axolotl.ai/docs/api/loaders.patch_manager.html - 2026-03-25T15:20:20.677Z + 2026-03-25T15:24:25.045Z https://docs.axolotl.ai/docs/api/utils.quantization.html - 2026-03-25T15:20:21.388Z + 2026-03-25T15:24:25.737Z https://docs.axolotl.ai/docs/api/monkeypatch.utils.html - 2026-03-25T15:20:21.133Z + 2026-03-25T15:24:25.488Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.user_defined.html - 2026-03-25T15:20:20.923Z 
+ 2026-03-25T15:24:25.285Z https://docs.axolotl.ai/docs/api/cli.quantize.html - 2026-03-25T15:20:20.455Z + 2026-03-25T15:24:24.825Z https://docs.axolotl.ai/docs/api/prompt_strategies.user_defined.html - 2026-03-25T15:20:20.819Z + 2026-03-25T15:24:25.183Z https://docs.axolotl.ai/docs/api/integrations.lm_eval.args.html - 2026-03-25T15:20:21.711Z + 2026-03-25T15:24:26.068Z https://docs.axolotl.ai/docs/api/monkeypatch.gradient_checkpointing.offload_cpu.html - 2026-03-25T15:20:21.175Z + 2026-03-25T15:24:25.529Z https://docs.axolotl.ai/docs/api/utils.schedulers.html - 2026-03-25T15:20:21.310Z + 2026-03-25T15:24:25.661Z https://docs.axolotl.ai/docs/api/kernels.geglu.html - 2026-03-25T15:20:21.041Z + 2026-03-25T15:24:25.400Z https://docs.axolotl.ai/docs/api/monkeypatch.trainer_fsdp_optim.html - 2026-03-25T15:20:21.147Z + 2026-03-25T15:24:25.501Z https://docs.axolotl.ai/docs/api/prompt_strategies.pygmalion.html - 2026-03-25T15:20:20.879Z + 2026-03-25T15:24:25.241Z https://docs.axolotl.ai/docs/api/common.architectures.html - 2026-03-25T15:20:21.717Z + 2026-03-25T15:24:26.074Z https://docs.axolotl.ai/docs/api/cli.utils.html - 2026-03-25T15:20:20.479Z + 2026-03-25T15:24:24.856Z https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_chat.html - 2026-03-25T15:20:20.791Z + 2026-03-25T15:24:25.156Z https://docs.axolotl.ai/docs/api/core.datasets.chat.html - 2026-03-25T15:20:20.252Z + 2026-03-25T15:24:24.627Z https://docs.axolotl.ai/docs/api/cli.evaluate.html - 2026-03-25T15:20:20.327Z + 2026-03-25T15:24:24.700Z https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_w_system.html - 2026-03-25T15:20:20.808Z + 2026-03-25T15:24:25.173Z https://docs.axolotl.ai/docs/api/prompt_strategies.orpo.chat_template.html - 2026-03-25T15:20:20.975Z + 2026-03-25T15:24:25.335Z https://docs.axolotl.ai/docs/api/utils.schemas.integrations.html - 2026-03-25T15:20:21.494Z + 2026-03-25T15:24:25.840Z https://docs.axolotl.ai/docs/api/utils.collators.mm_chat.html - 2026-03-25T15:20:21.776Z + 
2026-03-25T15:24:26.131Z https://docs.axolotl.ai/docs/api/utils.callbacks.perplexity.html - 2026-03-25T15:20:21.837Z + 2026-03-25T15:24:26.188Z https://docs.axolotl.ai/docs/api/prompt_strategies.chat_template.html - 2026-03-25T15:20:20.773Z + 2026-03-25T15:24:25.139Z https://docs.axolotl.ai/docs/api/kernels.utils.html - 2026-03-25T15:20:21.071Z + 2026-03-25T15:24:25.429Z https://docs.axolotl.ai/docs/api/cli.vllm_serve.html - 2026-03-25T15:20:20.464Z + 2026-03-25T15:24:24.841Z https://docs.axolotl.ai/docs/api/core.trainers.mamba.html - 2026-03-25T15:20:20.575Z + 2026-03-25T15:24:24.949Z https://docs.axolotl.ai/docs/api/utils.bench.html - 2026-03-25T15:20:21.238Z + 2026-03-25T15:24:25.590Z https://docs.axolotl.ai/docs/api/cli.utils.sweeps.html - 2026-03-25T15:20:20.515Z + 2026-03-25T15:24:24.891Z https://docs.axolotl.ai/docs/api/cli.merge_lora.html - 2026-03-25T15:20:20.424Z + 2026-03-25T15:24:24.792Z https://docs.axolotl.ai/docs/api/loaders.model.html - 2026-03-25T15:20:20.631Z + 2026-03-25T15:24:25.004Z https://docs.axolotl.ai/docs/api/cli.preprocess.html - 2026-03-25T15:20:20.449Z + 2026-03-25T15:24:24.818Z https://docs.axolotl.ai/docs/api/utils.callbacks.lisa.html - 2026-03-25T15:20:21.844Z + 2026-03-25T15:24:26.195Z https://docs.axolotl.ai/docs/api/prompt_strategies.metharme.html - 2026-03-25T15:20:20.865Z + 2026-03-25T15:24:25.229Z https://docs.axolotl.ai/docs/api/utils.schemas.enums.html - 2026-03-25T15:20:21.505Z + 2026-03-25T15:24:25.851Z https://docs.axolotl.ai/docs/api/kernels.quantize.html - 2026-03-25T15:20:21.069Z + 2026-03-25T15:24:25.427Z https://docs.axolotl.ai/docs/api/utils.schemas.model.html - 2026-03-25T15:20:21.418Z + 2026-03-25T15:24:25.765Z https://docs.axolotl.ai/docs/api/utils.collators.core.html - 2026-03-25T15:20:21.742Z + 2026-03-25T15:24:26.097Z https://docs.axolotl.ai/docs/api/core.builders.rl.html - 2026-03-25T15:20:20.193Z + 2026-03-25T15:24:24.570Z https://docs.axolotl.ai/docs/api/core.builders.causal.html - 2026-03-25T15:20:20.187Z 
+ 2026-03-25T15:24:24.563Z https://docs.axolotl.ai/docs/api/utils.distributed.html - 2026-03-25T15:20:21.336Z + 2026-03-25T15:24:25.686Z https://docs.axolotl.ai/docs/api/train.html - 2026-03-25T15:20:20.067Z + 2026-03-25T15:24:24.447Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.chat_template.html - 2026-03-25T15:20:20.893Z + 2026-03-25T15:24:25.254Z https://docs.axolotl.ai/docs/api/integrations.base.html - 2026-03-25T15:20:21.686Z + 2026-03-25T15:24:26.044Z https://docs.axolotl.ai/docs/api/core.chat.messages.html - 2026-03-25T15:20:20.240Z + 2026-03-25T15:24:24.615Z https://docs.axolotl.ai/docs/api/core.trainers.grpo.sampler.html - 2026-03-25T15:20:20.617Z + 2026-03-25T15:24:24.990Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.llama3.html - 2026-03-25T15:20:20.906Z + 2026-03-25T15:24:25.268Z https://docs.axolotl.ai/docs/api/integrations.liger.args.html - 2026-03-25T15:20:21.706Z + 2026-03-25T15:24:26.064Z https://docs.axolotl.ai/docs/api/monkeypatch.unsloth_.html - 2026-03-25T15:20:21.157Z + 2026-03-25T15:24:25.511Z https://docs.axolotl.ai/docs/api/logging_config.html - 2026-03-25T15:20:20.172Z + 2026-03-25T15:24:24.550Z https://docs.axolotl.ai/docs/api/common.datasets.html - 2026-03-25T15:20:21.738Z + 2026-03-25T15:24:26.094Z https://docs.axolotl.ai/docs/api/monkeypatch.llama_attn_hijack_xformers.html - 2026-03-25T15:20:21.080Z + 2026-03-25T15:24:25.438Z https://docs.axolotl.ai/docs/api/prompt_strategies.llama2_chat.html - 2026-03-25T15:20:20.835Z + 2026-03-25T15:24:25.199Z https://docs.axolotl.ai/docs/api/utils.schemas.datasets.html - 2026-03-25T15:20:21.452Z + 2026-03-25T15:24:25.798Z https://docs.axolotl.ai/docs/api/monkeypatch.stablelm_attn_hijack_flash.html - 2026-03-25T15:20:21.143Z + 2026-03-25T15:24:25.497Z https://docs.axolotl.ai/docs/api/cli.utils.train.html - 2026-03-25T15:20:20.530Z + 2026-03-25T15:24:24.906Z https://docs.axolotl.ai/docs/api/loaders.constants.html - 2026-03-25T15:20:20.679Z + 2026-03-25T15:24:25.046Z 
https://docs.axolotl.ai/docs/api/prompt_strategies.completion.html - 2026-03-25T15:20:20.843Z + 2026-03-25T15:24:25.207Z https://docs.axolotl.ai/docs/api/loaders.tokenizer.html - 2026-03-25T15:20:20.643Z + 2026-03-25T15:24:25.015Z https://docs.axolotl.ai/docs/api/core.training_args.html - 2026-03-25T15:20:20.209Z + 2026-03-25T15:24:24.586Z https://docs.axolotl.ai/docs/api/loaders.processor.html - 2026-03-25T15:20:20.648Z + 2026-03-25T15:24:25.017Z https://docs.axolotl.ai/docs/api/prompt_strategies.base.html - 2026-03-25T15:20:20.731Z + 2026-03-25T15:24:25.098Z https://docs.axolotl.ai/docs/api/utils.collators.mamba.html - 2026-03-25T15:20:21.770Z + 2026-03-25T15:24:26.125Z https://docs.axolotl.ai/docs/api/monkeypatch.lora_kernels.html - 2026-03-25T15:20:21.125Z + 2026-03-25T15:24:25.481Z https://docs.axolotl.ai/docs/api/utils.schemas.peft.html - 2026-03-25T15:20:21.463Z + 2026-03-25T15:24:25.809Z https://docs.axolotl.ai/docs/api/utils.data.sft.html - 2026-03-25T15:20:21.363Z + 2026-03-25T15:24:25.712Z https://docs.axolotl.ai/docs/api/core.trainers.utils.html - 2026-03-25T15:20:20.619Z + 2026-03-25T15:24:24.992Z https://docs.axolotl.ai/docs/api/utils.chat_templates.html - 2026-03-25T15:20:21.220Z + 2026-03-25T15:24:25.572Z https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_instruct.html - 2026-03-25T15:20:20.793Z + 2026-03-25T15:24:25.158Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.zephyr.html - 2026-03-25T15:20:20.921Z + 2026-03-25T15:24:25.283Z https://docs.axolotl.ai/docs/multipack.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/torchao.html - 2026-03-25T15:16:53.877Z + 2026-03-25T15:20:49.126Z https://docs.axolotl.ai/docs/reward_modelling.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.125Z https://docs.axolotl.ai/docs/nccl.html - 2026-03-25T15:16:53.876Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/multi-gpu.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z 
https://docs.axolotl.ai/docs/batch_vs_grad.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.121Z https://docs.axolotl.ai/docs/multimodal.html - 2026-03-25T15:16:53.875Z + 2026-03-25T15:20:49.124Z https://docs.axolotl.ai/docs/models/LiquidAI.html - 2026-03-25T15:20:46.518Z + 2026-03-25T15:24:50.298Z https://docs.axolotl.ai/docs/models/mistral.html - 2026-03-25T15:20:46.513Z + 2026-03-25T15:24:50.293Z https://docs.axolotl.ai/docs/models/trinity.html - 2026-03-25T15:20:46.508Z + 2026-03-25T15:24:50.287Z https://docs.axolotl.ai/docs/models/hunyuan.html - 2026-03-25T15:20:46.519Z + 2026-03-25T15:24:50.298Z https://docs.axolotl.ai/docs/models/phi.html - 2026-03-25T15:20:46.517Z + 2026-03-25T15:24:50.297Z https://docs.axolotl.ai/docs/models/apertus.html - 2026-03-25T15:20:46.516Z + 2026-03-25T15:24:50.295Z https://docs.axolotl.ai/docs/models/plano.html - 2026-03-25T15:20:46.506Z + 2026-03-25T15:24:50.285Z https://docs.axolotl.ai/docs/models/gemma3n.html - 2026-03-25T15:20:46.516Z + 2026-03-25T15:24:50.295Z https://docs.axolotl.ai/docs/models/arcee.html - 2026-03-25T15:20:46.508Z + 2026-03-25T15:24:50.287Z https://docs.axolotl.ai/docs/models/ministral3.html - 2026-03-25T15:20:46.509Z + 2026-03-25T15:24:50.288Z https://docs.axolotl.ai/docs/models/magistral/think.html - 2026-03-25T15:20:46.511Z + 2026-03-25T15:24:50.290Z https://docs.axolotl.ai/docs/models/llama-4.html - 2026-03-25T15:20:46.514Z + 2026-03-25T15:24:50.293Z https://docs.axolotl.ai/docs/models/voxtral.html - 2026-03-25T15:20:46.512Z + 2026-03-25T15:24:50.292Z https://docs.axolotl.ai/docs/models/magistral.html - 2026-03-25T15:20:46.511Z + 2026-03-25T15:24:50.290Z https://docs.axolotl.ai/docs/models/qwen3.html - 2026-03-25T15:20:46.515Z + 2026-03-25T15:24:50.294Z https://docs.axolotl.ai/docs/models/ministral.html - 2026-03-25T15:20:46.512Z + 2026-03-25T15:24:50.291Z https://docs.axolotl.ai/docs/models/ministral3/vision.html - 2026-03-25T15:20:46.509Z + 2026-03-25T15:24:50.289Z 
https://docs.axolotl.ai/docs/debugging.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/docs/faq.html - 2026-03-25T15:16:53.872Z + 2026-03-25T15:20:49.122Z https://docs.axolotl.ai/src/axolotl/integrations/LICENSE.html - 2026-03-25T15:16:53.910Z + 2026-03-25T15:20:49.154Z https://docs.axolotl.ai/FAQS.html - 2026-03-25T15:16:53.869Z + 2026-03-25T15:20:49.119Z