diff --git a/.nojekyll b/.nojekyll index 00c5817ec..dda6da96f 100644 --- a/.nojekyll +++ b/.nojekyll @@ -1 +1 @@ -a8f47fa8 \ No newline at end of file +5d0d8a8a \ No newline at end of file diff --git a/docs/api/integrations.base.html b/docs/api/integrations.base.html index c3069280f..9a7b70e82 100644 --- a/docs/api/integrations.base.html +++ b/docs/api/integrations.base.html @@ -518,7 +518,7 @@ pre_lora_load(cfg, model): Performs actions before LoRA weights are loaded. post_lora_load(cfg, model): Performs actions after LoRA weights are loaded. post_model_load(cfg, model): Performs actions after the model is loaded, inclusive of any adapters. create_optimizer(cfg, trainer): Creates and returns an optimizer for training. -create_lr_scheduler(cfg, trainer, optimizer): Creates and returns a learning rate scheduler. +create_lr_scheduler(cfg, trainer, optimizer, num_training_steps): Creates and returns a learning rate scheduler. add_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer before training. add_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after training.

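For orientation, the BasePlugin lifecycle hooks listed above (`register`, `pre_model_load`, `post_model_load`, etc.) can be sketched with a minimal custom plugin. This is an illustrative sketch only: `BasePlugin` is stubbed inline so the example is self-contained, and `LoggingPlugin` is a hypothetical plugin, not part of Axolotl; in real code you would subclass `axolotl.integrations.base.BasePlugin` instead.

```python
class BasePlugin:  # stand-in stub for integrations.base.BasePlugin
    def register(self, cfg): ...
    def pre_model_load(self, cfg): ...
    def post_model_load(self, cfg, model): ...


class LoggingPlugin(BasePlugin):
    """Hypothetical plugin that records which lifecycle hooks fired."""

    def __init__(self):
        self.events = []

    def register(self, cfg):
        self.events.append("register")

    def pre_model_load(self, cfg):
        self.events.append("pre_model_load")

    def post_model_load(self, cfg, model):
        self.events.append("post_model_load")


plugin = LoggingPlugin()
plugin.register({"plugin_option": True})
plugin.pre_model_load({})
plugin.post_model_load({}, model=object())
print(plugin.events)  # hooks recorded in lifecycle order
```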
@@ -612,14 +612,20 @@ List[callable]: A list of callback functions to be added to the TrainingArgs

create_lr_scheduler
-
integrations.base.BasePlugin.create_lr_scheduler(cfg, trainer, optimizer)
+
integrations.base.BasePlugin.create_lr_scheduler(
+    cfg,
+    trainer,
+    optimizer,
+    num_training_steps,
+)

Creates and returns a learning rate scheduler.

Parameters: cfg (dict): The configuration for the plugin. trainer (object): The trainer object for training. -optimizer (object): The optimizer for training.

+optimizer (object): The optimizer for training. +num_training_steps (int): Total number of training steps.

Returns: -object: The created learning rate scheduler.

+object (LRScheduler): The created learning rate scheduler.

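The new `num_training_steps` argument lets a plugin size its schedule up front. A minimal sketch of a plugin implementing the updated hook signature, assuming simplified stand-ins throughout: `LinearDecay` is a toy class in place of a real torch `LRScheduler`, and `cfg`/`trainer`/`optimizer` are dummies.

```python
class LinearDecay:
    """Toy stand-in for a torch LRScheduler: decays lr linearly to zero."""

    def __init__(self, base_lr, num_training_steps):
        self.base_lr = base_lr
        self.num_training_steps = num_training_steps
        self.step_count = 0

    def step(self):
        self.step_count += 1

    def get_last_lr(self):
        remaining = max(self.num_training_steps - self.step_count, 0)
        return [self.base_lr * remaining / self.num_training_steps]


class SchedulerPlugin:
    """Hypothetical plugin implementing the new four-argument hook."""

    def create_lr_scheduler(self, cfg, trainer, optimizer, num_training_steps):
        # Knowing the total step count up front lets the schedule
        # be sized at construction time.
        return LinearDecay(cfg["learning_rate"], num_training_steps)


plugin = SchedulerPlugin()
sched = plugin.create_lr_scheduler(
    cfg={"learning_rate": 0.1}, trainer=None, optimizer=None,
    num_training_steps=4,
)
for _ in range(2):
    sched.step()
print(sched.get_last_lr())  # halfway through 4 steps: half the base lr
```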
create_optimizer
@@ -845,7 +851,11 @@ List[callable]: A list of callback functions to be added to the TrainingArgs.

create_lr_scheduler
-
integrations.base.PluginManager.create_lr_scheduler(trainer, optimizer)
+
integrations.base.PluginManager.create_lr_scheduler(
+    trainer,
+    optimizer,
+    num_training_steps,
+)

Calls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.

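The "first non-None" dispatch described above can be sketched as follows. The plugin classes and `PluginManagerSketch` here are illustrative stand-ins for the real Axolotl classes, not the actual implementation.

```python
class NoOpPlugin:
    """Plugin that declines to provide a scheduler."""

    def create_lr_scheduler(self, trainer, optimizer, num_training_steps):
        return None


class ConstantPlugin:
    """Plugin that provides a (dummy) scheduler object."""

    def create_lr_scheduler(self, trainer, optimizer, num_training_steps):
        return {"name": "constant", "total_steps": num_training_steps}


class PluginManagerSketch:
    """Minimal sketch of the first-non-None dispatch pattern."""

    def __init__(self, plugins):
        self.plugins = plugins

    def create_lr_scheduler(self, trainer, optimizer, num_training_steps):
        for plugin in self.plugins:
            scheduler = plugin.create_lr_scheduler(
                trainer, optimizer, num_training_steps
            )
            if scheduler is not None:
                return scheduler  # first plugin to answer wins
        return None  # no plugin provided a scheduler


manager = PluginManagerSketch([NoOpPlugin(), ConstantPlugin()])
result = manager.create_lr_scheduler(None, None, num_training_steps=100)
print(result)  # ConstantPlugin's answer, since NoOpPlugin returned None
```

Registration order matters with this pattern: the first registered plugin that returns a scheduler short-circuits the rest.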
Parameters: trainer (object): The trainer object for training. diff --git a/search.json b/search.json index a51f0521a..7602408e7 100644 --- a/search.json +++ b/search.json @@ -2528,14 +2528,14 @@ "href": "docs/api/integrations.base.html", "title": "integrations.base", "section": "", - "text": "integrations.base\nBase class for all plugins.\nA plugin is a reusable, modular, and self-contained piece of code that extends the functionality of Axolotl.\nPlugins can be used to integrate third-party models, modify the training process, or add new features.\nTo create a new plugin, you need to inherit from the BasePlugin class and implement the required methods.\n\n\n\n\n\nName\nDescription\n\n\n\n\nBaseOptimizerFactory\nBase class for factories to create custom optimizers\n\n\nBasePlugin\nBase class for all plugins. Defines the interface for plugin methods.\n\n\nPluginManager\nThe PluginManager class is responsible for loading and managing plugins.\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory()\nBase class for factories to create custom optimizers\n\n\n\nintegrations.base.BasePlugin(self)\nBase class for all plugins. 
Defines the interface for plugin methods.\nAttributes:\nNone\nMethods:\nregister(cfg): Registers the plugin with the given configuration.\npre_model_load(cfg): Performs actions before the model is loaded.\npost_model_build(cfg, model): Performs actions after the model is loaded, but before LoRA adapters are applied.\npre_lora_load(cfg, model): Performs actions before LoRA weights are loaded.\npost_lora_load(cfg, model): Performs actions after LoRA weights are loaded.\npost_model_load(cfg, model): Performs actions after the model is loaded, inclusive of any adapters.\ncreate_optimizer(cfg, trainer): Creates and returns an optimizer for training.\ncreate_lr_scheduler(cfg, trainer, optimizer): Creates and returns a learning rate scheduler.\nadd_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer before training.\nadd_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after training.\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nAdds callbacks to the trainer after creating the trainer.\n\n\nadd_callbacks_pre_trainer\nsetup callbacks before creating the trainer.\n\n\ncreate_lr_scheduler\nCreates and returns a learning rate scheduler.\n\n\ncreate_optimizer\nCreates and returns an optimizer for training.\n\n\nget_input_args\nReturns a pydantic model for the plugin’s input arguments.\n\n\nget_trainer_cls\nReturns a custom class for the trainer.\n\n\npost_lora_load\nPerforms actions after LoRA weights are loaded.\n\n\npost_model_build\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\npost_model_load\nPerforms actions after the model is loaded.\n\n\npost_train\nPerforms actions after training is complete.\n\n\npost_train_unload\nPerforms actions after training is complete and the model is unloaded.\n\n\npre_lora_load\nPerforms actions before LoRA weights are loaded.\n\n\npre_model_load\nPerforms actions before the model is loaded.\n\n\nregister\nRegisters the plugin with the given 
configuration.\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_post_trainer(cfg, trainer)\nAdds callbacks to the trainer after creating the trainer.\nThis is useful for callbacks that require access to the model or trainer.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\nReturns:\nList[callable]: A list of callback functions to be added\n\n\n\nintegrations.base.BasePlugin.add_callbacks_pre_trainer(cfg, model)\nsetup callbacks before creating the trainer.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs\n\n\n\nintegrations.base.BasePlugin.create_lr_scheduler(cfg, trainer, optimizer)\nCreates and returns a learning rate scheduler.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\noptimizer (object): The optimizer for training.\nReturns:\nobject: The created learning rate scheduler.\n\n\n\nintegrations.base.BasePlugin.create_optimizer(cfg, trainer)\nCreates and returns an optimizer for training.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\nReturns:\nobject: The created optimizer.\n\n\n\nintegrations.base.BasePlugin.get_input_args()\nReturns a pydantic model for the plugin’s input arguments.\n\n\n\nintegrations.base.BasePlugin.get_trainer_cls(cfg)\nReturns a custom class for the trainer.\nParameters:\ncfg (dict): The global axolotl configuration.\nReturns:\nclass: The class for the trainer.\n\n\n\nintegrations.base.BasePlugin.post_lora_load(cfg, model)\nPerforms actions after LoRA weights are loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_model_build(cfg, model)\nPerforms actions after the model is built/loaded, but before any 
adapters are applied.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_load(cfg, model)\nPerforms actions after the model is loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_train(cfg, model)\nPerforms actions after training is complete.\nParameters:\ncfg (dict): The axolotl configuration\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_train_unload(cfg)\nPerforms actions after training is complete and the model is unloaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.pre_lora_load(cfg, model)\nPerforms actions before LoRA weights are loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.pre_model_load(cfg)\nPerforms actions before the model is loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.register(cfg)\nRegisters the plugin with the given configuration.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\n\n\nintegrations.base.PluginManager()\nThe PluginManager class is responsible for loading and managing plugins.\nIt should be a singleton so it can be accessed from anywhere in the codebase.\nAttributes:\nplugins (ListBasePlugin): A list of loaded plugins.\nMethods:\nget_instance(): Static method to get the singleton instance of PluginManager.\nregister(plugin_name: str): Registers a new plugin by its name.\npre_model_load(cfg): Calls the pre_model_load method of all registered plugins.\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nCalls the add_callbacks_post_trainer method of all registered 
plugins.\n\n\nadd_callbacks_pre_trainer\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\ncreate_lr_scheduler\nCalls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.\n\n\ncreate_optimizer\nCalls the create_optimizer method of all registered plugins and returns the first non-None optimizer.\n\n\nget_input_args\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\n\n\nget_instance\nReturns the singleton instance of PluginManager.\n\n\nget_trainer_cls\nCalls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.\n\n\npost_lora_load\nCalls the post_lora_load method of all registered plugins.\n\n\npost_model_build\nCalls the post_model_build method of all registered plugins after the model has been built/loaded,\n\n\npost_model_load\nCalls the post_model_load method of all registered plugins after the model has been loaded\n\n\npost_train\nCalls the post_train method of all registered plugins.\n\n\npost_train_unload\nCalls the post_train_unload method of all registered plugins.\n\n\npre_lora_load\nCalls the pre_lora_load method of all registered plugins.\n\n\npre_model_load\nCalls the pre_model_load method of all registered plugins.\n\n\nregister\nRegisters a new plugin by its name.\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)\nCalls the add_callbacks_post_trainer method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\ntrainer (object): The trainer object for training.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs.\n\n\n\nintegrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)\nCalls the add_callbacks_pre_trainer method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nList[callable]: A list of callback 
functions to be added to the TrainingArgs.\n\n\n\nintegrations.base.PluginManager.create_lr_scheduler(trainer, optimizer)\nCalls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.\nParameters:\ntrainer (object): The trainer object for training.\noptimizer (object): The optimizer for training.\nReturns:\nobject: The created learning rate scheduler, or None if none was found.\n\n\n\nintegrations.base.PluginManager.create_optimizer(trainer)\nCalls the create_optimizer method of all registered plugins and returns the first non-None optimizer.\nParameters:\ntrainer (object): The trainer object for training.\nReturns:\nobject: The created optimizer, or None if none was found.\n\n\n\nintegrations.base.PluginManager.get_input_args()\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\nReturns:\nlist[str]: A list of Pydantic classes for all registered plugins’ input arguments.’\n\n\n\nintegrations.base.PluginManager.get_instance()\nReturns the singleton instance of PluginManager.\nIf the instance doesn’t exist, it creates a new one.\n\n\n\nintegrations.base.PluginManager.get_trainer_cls(cfg)\nCalls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nobject: The trainer class, or None if none was found.\n\n\n\nintegrations.base.PluginManager.post_lora_load(cfg, model)\nCalls the post_lora_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_model_build(cfg, model)\nCalls the post_model_build method of all registered plugins after the model has been built/loaded,\nbut before any adapters have been applied.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugins.\nrequired\n\n\nmodel\nobject\nThe loaded 
model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_load(cfg, model)\nCalls the post_model_load method of all registered plugins after the model has been loaded\ninclusive of any adapters\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_train(cfg, model)\nCalls the post_train method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_train_unload(cfg)\nCalls the post_train_unload method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.pre_lora_load(cfg, model)\nCalls the pre_lora_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.pre_model_load(cfg)\nCalls the pre_model_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.register(plugin_name)\nRegisters a new plugin by its name.\nParameters:\nplugin_name (str): The name of the plugin to be registered.\nReturns:\nNone\nRaises:\nImportError: If the plugin module cannot be imported.\n\n\n\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nload_plugin\nLoads a plugin based on the given plugin name.\n\n\n\n\n\nintegrations.base.load_plugin(plugin_name)\nLoads a plugin based on the given plugin name.\nThe plugin name should be in the format “module_name.class_name”.\nThis function splits the plugin name into module and class, imports the module,\nretrieves the class from the module, and creates an instance of the class.\nParameters:\nplugin_name (str): The name of the plugin to be loaded. 
The name should be in the format “module_name.class_name”.\nReturns:\nBasePlugin: An instance of the loaded plugin.\nRaises:\nImportError: If the plugin module cannot be imported." + "text": "integrations.base\nBase class for all plugins.\nA plugin is a reusable, modular, and self-contained piece of code that extends the functionality of Axolotl.\nPlugins can be used to integrate third-party models, modify the training process, or add new features.\nTo create a new plugin, you need to inherit from the BasePlugin class and implement the required methods.\n\n\n\n\n\nName\nDescription\n\n\n\n\nBaseOptimizerFactory\nBase class for factories to create custom optimizers\n\n\nBasePlugin\nBase class for all plugins. Defines the interface for plugin methods.\n\n\nPluginManager\nThe PluginManager class is responsible for loading and managing plugins.\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory()\nBase class for factories to create custom optimizers\n\n\n\nintegrations.base.BasePlugin(self)\nBase class for all plugins. 
Defines the interface for plugin methods.\nAttributes:\nNone\nMethods:\nregister(cfg): Registers the plugin with the given configuration.\npre_model_load(cfg): Performs actions before the model is loaded.\npost_model_build(cfg, model): Performs actions after the model is loaded, but before LoRA adapters are applied.\npre_lora_load(cfg, model): Performs actions before LoRA weights are loaded.\npost_lora_load(cfg, model): Performs actions after LoRA weights are loaded.\npost_model_load(cfg, model): Performs actions after the model is loaded, inclusive of any adapters.\ncreate_optimizer(cfg, trainer): Creates and returns an optimizer for training.\ncreate_lr_scheduler(cfg, trainer, optimizer, num_training_steps): Creates and returns a learning rate scheduler.\nadd_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer before training.\nadd_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after training.\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nAdds callbacks to the trainer after creating the trainer.\n\n\nadd_callbacks_pre_trainer\nsetup callbacks before creating the trainer.\n\n\ncreate_lr_scheduler\nCreates and returns a learning rate scheduler.\n\n\ncreate_optimizer\nCreates and returns an optimizer for training.\n\n\nget_input_args\nReturns a pydantic model for the plugin’s input arguments.\n\n\nget_trainer_cls\nReturns a custom class for the trainer.\n\n\npost_lora_load\nPerforms actions after LoRA weights are loaded.\n\n\npost_model_build\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\npost_model_load\nPerforms actions after the model is loaded.\n\n\npost_train\nPerforms actions after training is complete.\n\n\npost_train_unload\nPerforms actions after training is complete and the model is unloaded.\n\n\npre_lora_load\nPerforms actions before LoRA weights are loaded.\n\n\npre_model_load\nPerforms actions before the model is loaded.\n\n\nregister\nRegisters the 
plugin with the given configuration.\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_post_trainer(cfg, trainer)\nAdds callbacks to the trainer after creating the trainer.\nThis is useful for callbacks that require access to the model or trainer.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\nReturns:\nList[callable]: A list of callback functions to be added\n\n\n\nintegrations.base.BasePlugin.add_callbacks_pre_trainer(cfg, model)\nsetup callbacks before creating the trainer.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs\n\n\n\nintegrations.base.BasePlugin.create_lr_scheduler(\n cfg,\n trainer,\n optimizer,\n num_training_steps,\n)\nCreates and returns a learning rate scheduler.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\noptimizer (object): The optimizer for training.\nnum_training_steps (int): Total number of training steps\nReturns:\nobject (LRScheduler): The created learning rate scheduler.\n\n\n\nintegrations.base.BasePlugin.create_optimizer(cfg, trainer)\nCreates and returns an optimizer for training.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\nReturns:\nobject: The created optimizer.\n\n\n\nintegrations.base.BasePlugin.get_input_args()\nReturns a pydantic model for the plugin’s input arguments.\n\n\n\nintegrations.base.BasePlugin.get_trainer_cls(cfg)\nReturns a custom class for the trainer.\nParameters:\ncfg (dict): The global axolotl configuration.\nReturns:\nclass: The class for the trainer.\n\n\n\nintegrations.base.BasePlugin.post_lora_load(cfg, model)\nPerforms actions after LoRA weights are loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded 
model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_model_build(cfg, model)\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_load(cfg, model)\nPerforms actions after the model is loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_train(cfg, model)\nPerforms actions after training is complete.\nParameters:\ncfg (dict): The axolotl configuration\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_train_unload(cfg)\nPerforms actions after training is complete and the model is unloaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.pre_lora_load(cfg, model)\nPerforms actions before LoRA weights are loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.pre_model_load(cfg)\nPerforms actions before the model is loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.register(cfg)\nRegisters the plugin with the given configuration.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\n\n\nintegrations.base.PluginManager()\nThe PluginManager class is responsible for loading and managing plugins.\nIt should be a singleton so it can be accessed from anywhere in the codebase.\nAttributes:\nplugins (ListBasePlugin): A list of loaded plugins.\nMethods:\nget_instance(): Static method to get the singleton instance of PluginManager.\nregister(plugin_name: str): Registers a new plugin by its name.\npre_model_load(cfg): Calls the pre_model_load method of all registered 
plugins.\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\nadd_callbacks_pre_trainer\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\ncreate_lr_scheduler\nCalls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.\n\n\ncreate_optimizer\nCalls the create_optimizer method of all registered plugins and returns the first non-None optimizer.\n\n\nget_input_args\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\n\n\nget_instance\nReturns the singleton instance of PluginManager.\n\n\nget_trainer_cls\nCalls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.\n\n\npost_lora_load\nCalls the post_lora_load method of all registered plugins.\n\n\npost_model_build\nCalls the post_model_build method of all registered plugins after the model has been built/loaded,\n\n\npost_model_load\nCalls the post_model_load method of all registered plugins after the model has been loaded\n\n\npost_train\nCalls the post_train method of all registered plugins.\n\n\npost_train_unload\nCalls the post_train_unload method of all registered plugins.\n\n\npre_lora_load\nCalls the pre_lora_load method of all registered plugins.\n\n\npre_model_load\nCalls the pre_model_load method of all registered plugins.\n\n\nregister\nRegisters a new plugin by its name.\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)\nCalls the add_callbacks_post_trainer method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\ntrainer (object): The trainer object for training.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs.\n\n\n\nintegrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)\nCalls the add_callbacks_pre_trainer method of all registered 
plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs.\n\n\n\nintegrations.base.PluginManager.create_lr_scheduler(\n trainer,\n optimizer,\n num_training_steps,\n)\nCalls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.\nParameters:\ntrainer (object): The trainer object for training.\noptimizer (object): The optimizer for training.\nReturns:\nobject: The created learning rate scheduler, or None if none was found.\n\n\n\nintegrations.base.PluginManager.create_optimizer(trainer)\nCalls the create_optimizer method of all registered plugins and returns the first non-None optimizer.\nParameters:\ntrainer (object): The trainer object for training.\nReturns:\nobject: The created optimizer, or None if none was found.\n\n\n\nintegrations.base.PluginManager.get_input_args()\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\nReturns:\nlist[str]: A list of Pydantic classes for all registered plugins’ input arguments.’\n\n\n\nintegrations.base.PluginManager.get_instance()\nReturns the singleton instance of PluginManager.\nIf the instance doesn’t exist, it creates a new one.\n\n\n\nintegrations.base.PluginManager.get_trainer_cls(cfg)\nCalls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nobject: The trainer class, or None if none was found.\n\n\n\nintegrations.base.PluginManager.post_lora_load(cfg, model)\nCalls the post_lora_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_model_build(cfg, model)\nCalls the post_model_build method of all registered plugins after the model has been 
built/loaded,\nbut before any adapters have been applied.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugins.\nrequired\n\n\nmodel\nobject\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_load(cfg, model)\nCalls the post_model_load method of all registered plugins after the model has been loaded\ninclusive of any adapters\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_train(cfg, model)\nCalls the post_train method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_train_unload(cfg)\nCalls the post_train_unload method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.pre_lora_load(cfg, model)\nCalls the pre_lora_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.pre_model_load(cfg)\nCalls the pre_model_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.register(plugin_name)\nRegisters a new plugin by its name.\nParameters:\nplugin_name (str): The name of the plugin to be registered.\nReturns:\nNone\nRaises:\nImportError: If the plugin module cannot be imported.\n\n\n\n\n\n\n\n\n\nName\nDescription\n\n\n\n\nload_plugin\nLoads a plugin based on the given plugin name.\n\n\n\n\n\nintegrations.base.load_plugin(plugin_name)\nLoads a plugin based on the given plugin name.\nThe plugin name should be in the format “module_name.class_name”.\nThis function splits the plugin name into module 
and class, imports the module,\nretrieves the class from the module, and creates an instance of the class.\nParameters:\nplugin_name (str): The name of the plugin to be loaded. The name should be in the format “module_name.class_name”.\nReturns:\nBasePlugin: An instance of the loaded plugin.\nRaises:\nImportError: If the plugin module cannot be imported." }, { "objectID": "docs/api/integrations.base.html#classes", "href": "docs/api/integrations.base.html#classes", "title": "integrations.base", "section": "", - "text": "Name\nDescription\n\n\n\n\nBaseOptimizerFactory\nBase class for factories to create custom optimizers\n\n\nBasePlugin\nBase class for all plugins. Defines the interface for plugin methods.\n\n\nPluginManager\nThe PluginManager class is responsible for loading and managing plugins.\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory()\nBase class for factories to create custom optimizers\n\n\n\nintegrations.base.BasePlugin(self)\nBase class for all plugins. Defines the interface for plugin methods.\nAttributes:\nNone\nMethods:\nregister(cfg): Registers the plugin with the given configuration.\npre_model_load(cfg): Performs actions before the model is loaded.\npost_model_build(cfg, model): Performs actions after the model is loaded, but before LoRA adapters are applied.\npre_lora_load(cfg, model): Performs actions before LoRA weights are loaded.\npost_lora_load(cfg, model): Performs actions after LoRA weights are loaded.\npost_model_load(cfg, model): Performs actions after the model is loaded, inclusive of any adapters.\ncreate_optimizer(cfg, trainer): Creates and returns an optimizer for training.\ncreate_lr_scheduler(cfg, trainer, optimizer): Creates and returns a learning rate scheduler.\nadd_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer before training.\nadd_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after training.\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nAdds callbacks to the 
trainer after creating the trainer.\n\n\nadd_callbacks_pre_trainer\nsetup callbacks before creating the trainer.\n\n\ncreate_lr_scheduler\nCreates and returns a learning rate scheduler.\n\n\ncreate_optimizer\nCreates and returns an optimizer for training.\n\n\nget_input_args\nReturns a pydantic model for the plugin’s input arguments.\n\n\nget_trainer_cls\nReturns a custom class for the trainer.\n\n\npost_lora_load\nPerforms actions after LoRA weights are loaded.\n\n\npost_model_build\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\npost_model_load\nPerforms actions after the model is loaded.\n\n\npost_train\nPerforms actions after training is complete.\n\n\npost_train_unload\nPerforms actions after training is complete and the model is unloaded.\n\n\npre_lora_load\nPerforms actions before LoRA weights are loaded.\n\n\npre_model_load\nPerforms actions before the model is loaded.\n\n\nregister\nRegisters the plugin with the given configuration.\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_post_trainer(cfg, trainer)\nAdds callbacks to the trainer after creating the trainer.\nThis is useful for callbacks that require access to the model or trainer.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\nReturns:\nList[callable]: A list of callback functions to be added\n\n\n\nintegrations.base.BasePlugin.add_callbacks_pre_trainer(cfg, model)\nsetup callbacks before creating the trainer.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs\n\n\n\nintegrations.base.BasePlugin.create_lr_scheduler(cfg, trainer, optimizer)\nCreates and returns a learning rate scheduler.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\noptimizer (object): The optimizer for 
training.\nReturns:\nobject: The created learning rate scheduler.\n\n\n\nintegrations.base.BasePlugin.create_optimizer(cfg, trainer)\nCreates and returns an optimizer for training.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\nReturns:\nobject: The created optimizer.\n\n\n\nintegrations.base.BasePlugin.get_input_args()\nReturns a pydantic model for the plugin’s input arguments.\n\n\n\nintegrations.base.BasePlugin.get_trainer_cls(cfg)\nReturns a custom class for the trainer.\nParameters:\ncfg (dict): The global axolotl configuration.\nReturns:\nclass: The class for the trainer.\n\n\n\nintegrations.base.BasePlugin.post_lora_load(cfg, model)\nPerforms actions after LoRA weights are loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_model_build(cfg, model)\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_load(cfg, model)\nPerforms actions after the model is loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_train(cfg, model)\nPerforms actions after training is complete.\nParameters:\ncfg (dict): The axolotl configuration\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_train_unload(cfg)\nPerforms actions after training is complete and the model is unloaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.pre_lora_load(cfg, model)\nPerforms actions before LoRA weights are loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded 
model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.pre_model_load(cfg)\nPerforms actions before the model is loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.register(cfg)\nRegisters the plugin with the given configuration.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\n\n\nintegrations.base.PluginManager()\nThe PluginManager class is responsible for loading and managing plugins.\nIt should be a singleton so it can be accessed from anywhere in the codebase.\nAttributes:\nplugins (ListBasePlugin): A list of loaded plugins.\nMethods:\nget_instance(): Static method to get the singleton instance of PluginManager.\nregister(plugin_name: str): Registers a new plugin by its name.\npre_model_load(cfg): Calls the pre_model_load method of all registered plugins.\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\nadd_callbacks_pre_trainer\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\ncreate_lr_scheduler\nCalls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.\n\n\ncreate_optimizer\nCalls the create_optimizer method of all registered plugins and returns the first non-None optimizer.\n\n\nget_input_args\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\n\n\nget_instance\nReturns the singleton instance of PluginManager.\n\n\nget_trainer_cls\nCalls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.\n\n\npost_lora_load\nCalls the post_lora_load method of all registered plugins.\n\n\npost_model_build\nCalls the post_model_build method of all registered plugins after the model has been built/loaded,\n\n\npost_model_load\nCalls the post_model_load method of all registered plugins after the model has been 
loaded\n\n\npost_train\nCalls the post_train method of all registered plugins.\n\n\npost_train_unload\nCalls the post_train_unload method of all registered plugins.\n\n\npre_lora_load\nCalls the pre_lora_load method of all registered plugins.\n\n\npre_model_load\nCalls the pre_model_load method of all registered plugins.\n\n\nregister\nRegisters a new plugin by its name.\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)\nCalls the add_callbacks_post_trainer method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\ntrainer (object): The trainer object for training.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs.\n\n\n\nintegrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)\nCalls the add_callbacks_pre_trainer method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs.\n\n\n\nintegrations.base.PluginManager.create_lr_scheduler(trainer, optimizer)\nCalls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.\nParameters:\ntrainer (object): The trainer object for training.\noptimizer (object): The optimizer for training.\nReturns:\nobject: The created learning rate scheduler, or None if none was found.\n\n\n\nintegrations.base.PluginManager.create_optimizer(trainer)\nCalls the create_optimizer method of all registered plugins and returns the first non-None optimizer.\nParameters:\ntrainer (object): The trainer object for training.\nReturns:\nobject: The created optimizer, or None if none was found.\n\n\n\nintegrations.base.PluginManager.get_input_args()\nReturns a list of Pydantic classes for all registered plugins’ input arguments.’\nReturns:\nlist[str]: A list of Pydantic classes for all registered plugins’ input 
arguments.’\n\n\n\nintegrations.base.PluginManager.get_instance()\nReturns the singleton instance of PluginManager.\nIf the instance doesn’t exist, it creates a new one.\n\n\n\nintegrations.base.PluginManager.get_trainer_cls(cfg)\nCalls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nobject: The trainer class, or None if none was found.\n\n\n\nintegrations.base.PluginManager.post_lora_load(cfg, model)\nCalls the post_lora_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_model_build(cfg, model)\nCalls the post_model_build method of all registered plugins after the model has been built/loaded,\nbut before any adapters have been applied.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugins.\nrequired\n\n\nmodel\nobject\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_load(cfg, model)\nCalls the post_model_load method of all registered plugins after the model has been loaded\ninclusive of any adapters\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_train(cfg, model)\nCalls the post_train method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_train_unload(cfg)\nCalls the post_train_unload method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.pre_lora_load(cfg, model)\nCalls the pre_lora_load method of all registered plugins.\nParameters:\ncfg (dict): The 
configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.pre_model_load(cfg)\nCalls the pre_model_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.register(plugin_name)\nRegisters a new plugin by its name.\nParameters:\nplugin_name (str): The name of the plugin to be registered.\nReturns:\nNone\nRaises:\nImportError: If the plugin module cannot be imported." + "text": "Name\nDescription\n\n\n\n\nBaseOptimizerFactory\nBase class for factories to create custom optimizers\n\n\nBasePlugin\nBase class for all plugins. Defines the interface for plugin methods.\n\n\nPluginManager\nThe PluginManager class is responsible for loading and managing plugins.\n\n\n\n\n\nintegrations.base.BaseOptimizerFactory()\nBase class for factories to create custom optimizers\n\n\n\nintegrations.base.BasePlugin(self)\nBase class for all plugins. Defines the interface for plugin methods.\nAttributes:\nNone\nMethods:\nregister(cfg): Registers the plugin with the given configuration.\npre_model_load(cfg): Performs actions before the model is loaded.\npost_model_build(cfg, model): Performs actions after the model is loaded, but before LoRA adapters are applied.\npre_lora_load(cfg, model): Performs actions before LoRA weights are loaded.\npost_lora_load(cfg, model): Performs actions after LoRA weights are loaded.\npost_model_load(cfg, model): Performs actions after the model is loaded, inclusive of any adapters.\ncreate_optimizer(cfg, trainer): Creates and returns an optimizer for training.\ncreate_lr_scheduler(cfg, trainer, optimizer, num_training_steps): Creates and returns a learning rate scheduler.\nadd_callbacks_pre_trainer(cfg, model): Adds callbacks to the trainer before training.\nadd_callbacks_post_trainer(cfg, trainer): Adds callbacks to the trainer after 
training.\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nAdds callbacks to the trainer after creating the trainer.\n\n\nadd_callbacks_pre_trainer\nSets up callbacks before creating the trainer.\n\n\ncreate_lr_scheduler\nCreates and returns a learning rate scheduler.\n\n\ncreate_optimizer\nCreates and returns an optimizer for training.\n\n\nget_input_args\nReturns a pydantic model for the plugin’s input arguments.\n\n\nget_trainer_cls\nReturns a custom class for the trainer.\n\n\npost_lora_load\nPerforms actions after LoRA weights are loaded.\n\n\npost_model_build\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\npost_model_load\nPerforms actions after the model is loaded.\n\n\npost_train\nPerforms actions after training is complete.\n\n\npost_train_unload\nPerforms actions after training is complete and the model is unloaded.\n\n\npre_lora_load\nPerforms actions before LoRA weights are loaded.\n\n\npre_model_load\nPerforms actions before the model is loaded.\n\n\nregister\nRegisters the plugin with the given configuration.\n\n\n\n\n\nintegrations.base.BasePlugin.add_callbacks_post_trainer(cfg, trainer)\nAdds callbacks to the trainer after creating the trainer.\nThis is useful for callbacks that require access to the model or trainer.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\nReturns:\nList[callable]: A list of callback functions to be added to the trainer\n\n\n\nintegrations.base.BasePlugin.add_callbacks_pre_trainer(cfg, model)\nSets up callbacks before creating the trainer.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs\n\n\n\nintegrations.base.BasePlugin.create_lr_scheduler(\n cfg,\n trainer,\n optimizer,\n num_training_steps,\n)\nCreates and returns a learning rate scheduler.\nParameters:\ncfg (dict): The 
configuration for the plugin.\ntrainer (object): The trainer object for training.\noptimizer (object): The optimizer for training.\nnum_training_steps (int): Total number of training steps.\nReturns:\nLRScheduler: The created learning rate scheduler.\n\n\n\nintegrations.base.BasePlugin.create_optimizer(cfg, trainer)\nCreates and returns an optimizer for training.\nParameters:\ncfg (dict): The configuration for the plugin.\ntrainer (object): The trainer object for training.\nReturns:\nobject: The created optimizer.\n\n\n\nintegrations.base.BasePlugin.get_input_args()\nReturns a pydantic model for the plugin’s input arguments.\n\n\n\nintegrations.base.BasePlugin.get_trainer_cls(cfg)\nReturns a custom class for the trainer.\nParameters:\ncfg (dict): The global axolotl configuration.\nReturns:\nclass: The class for the trainer.\n\n\n\nintegrations.base.BasePlugin.post_lora_load(cfg, model)\nPerforms actions after LoRA weights are loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_model_build(cfg, model)\nPerforms actions after the model is built/loaded, but before any adapters are applied.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugin.\nrequired\n\n\n\n\n\n\n\nintegrations.base.BasePlugin.post_model_load(cfg, model)\nPerforms actions after the model is loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_train(cfg, model)\nPerforms actions after training is complete.\nParameters:\ncfg (dict): The axolotl configuration.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.post_train_unload(cfg)\nPerforms actions after training is complete and the model is unloaded.\nParameters:\ncfg (dict): The configuration for the 
plugin.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.pre_lora_load(cfg, model)\nPerforms actions before LoRA weights are loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.pre_model_load(cfg)\nPerforms actions before the model is loaded.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\nintegrations.base.BasePlugin.register(cfg)\nRegisters the plugin with the given configuration.\nParameters:\ncfg (dict): The configuration for the plugin.\nReturns:\nNone\n\n\n\n\n\nintegrations.base.PluginManager()\nThe PluginManager class is responsible for loading and managing plugins.\nIt should be a singleton so it can be accessed from anywhere in the codebase.\nAttributes:\nplugins (List[BasePlugin]): A list of loaded plugins.\nMethods:\nget_instance(): Static method to get the singleton instance of PluginManager.\nregister(plugin_name: str): Registers a new plugin by its name.\npre_model_load(cfg): Calls the pre_model_load method of all registered plugins.\n\n\n\n\n\nName\nDescription\n\n\n\n\nadd_callbacks_post_trainer\nCalls the add_callbacks_post_trainer method of all registered plugins.\n\n\nadd_callbacks_pre_trainer\nCalls the add_callbacks_pre_trainer method of all registered plugins.\n\n\ncreate_lr_scheduler\nCalls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.\n\n\ncreate_optimizer\nCalls the create_optimizer method of all registered plugins and returns the first non-None optimizer.\n\n\nget_input_args\nReturns a list of Pydantic classes for all registered plugins’ input arguments.\n\n\nget_instance\nReturns the singleton instance of PluginManager.\n\n\nget_trainer_cls\nCalls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.\n\n\npost_lora_load\nCalls the post_lora_load method of all registered 
plugins.\n\n\npost_model_build\nCalls the post_model_build method of all registered plugins after the model has been built/loaded, but before any adapters have been applied.\n\n\npost_model_load\nCalls the post_model_load method of all registered plugins after the model has been loaded, inclusive of any adapters.\n\n\npost_train\nCalls the post_train method of all registered plugins.\n\n\npost_train_unload\nCalls the post_train_unload method of all registered plugins.\n\n\npre_lora_load\nCalls the pre_lora_load method of all registered plugins.\n\n\npre_model_load\nCalls the pre_model_load method of all registered plugins.\n\n\nregister\nRegisters a new plugin by its name.\n\n\n\n\n\nintegrations.base.PluginManager.add_callbacks_post_trainer(cfg, trainer)\nCalls the add_callbacks_post_trainer method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\ntrainer (object): The trainer object for training.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs.\n\n\n\nintegrations.base.PluginManager.add_callbacks_pre_trainer(cfg, model)\nCalls the add_callbacks_pre_trainer method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nList[callable]: A list of callback functions to be added to the TrainingArgs.\n\n\n\nintegrations.base.PluginManager.create_lr_scheduler(\n trainer,\n optimizer,\n num_training_steps,\n)\nCalls the create_lr_scheduler method of all registered plugins and returns the first non-None scheduler.\nParameters:\ntrainer (object): The trainer object for training.\noptimizer (object): The optimizer for training.\nnum_training_steps (int): Total number of training steps.\nReturns:\nobject: The created learning rate scheduler, or None if none was found.\n\n\n\nintegrations.base.PluginManager.create_optimizer(trainer)\nCalls the create_optimizer method of all registered plugins and returns the first non-None optimizer.\nParameters:\ntrainer (object): The trainer object for training.\nReturns:\nobject: The created optimizer, or None if 
none was found.\n\n\n\nintegrations.base.PluginManager.get_input_args()\nReturns a list of Pydantic classes for all registered plugins’ input arguments.\nReturns:\nlist: A list of Pydantic classes for all registered plugins’ input arguments.\n\n\n\nintegrations.base.PluginManager.get_instance()\nReturns the singleton instance of PluginManager.\nIf the instance doesn’t exist, it creates a new one.\n\n\n\nintegrations.base.PluginManager.get_trainer_cls(cfg)\nCalls the get_trainer_cls method of all registered plugins and returns the first non-None trainer class.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nobject: The trainer class, or None if none was found.\n\n\n\nintegrations.base.PluginManager.post_lora_load(cfg, model)\nCalls the post_lora_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_model_build(cfg, model)\nCalls the post_model_build method of all registered plugins after the model has been built/loaded,\nbut before any adapters have been applied.\n\n\n\n\n\nName\nType\nDescription\nDefault\n\n\n\n\ncfg\ndict\nThe configuration for the plugins.\nrequired\n\n\nmodel\nobject\nThe loaded model.\nrequired\n\n\n\n\n\n\n\nintegrations.base.PluginManager.post_model_load(cfg, model)\nCalls the post_model_load method of all registered plugins after the model has been loaded,\ninclusive of any adapters.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_train(cfg, model)\nCalls the post_train method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.post_train_unload(cfg)\nCalls the post_train_unload method of all registered plugins.\nParameters:\ncfg (dict): The 
configuration for the plugins.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.pre_lora_load(cfg, model)\nCalls the pre_lora_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nmodel (object): The loaded model.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.pre_model_load(cfg)\nCalls the pre_model_load method of all registered plugins.\nParameters:\ncfg (dict): The configuration for the plugins.\nReturns:\nNone\n\n\n\nintegrations.base.PluginManager.register(plugin_name)\nRegisters a new plugin by its name.\nParameters:\nplugin_name (str): The name of the plugin to be registered.\nReturns:\nNone\nRaises:\nImportError: If the plugin module cannot be imported." }, { "objectID": "docs/api/integrations.base.html#functions", diff --git a/sitemap.xml b/sitemap.xml index 990160e7a..9f987b894 100644 --- a/sitemap.xml +++ b/sitemap.xml @@ -2,682 +2,682 @@ https://docs.axolotl.ai/examples/colab-notebooks/colab-axolotl-example.html - 2025-04-29T20:19:00.606Z + 2025-04-29T21:08:46.847Z https://docs.axolotl.ai/index.html - 2025-04-29T20:19:00.618Z + 2025-04-29T21:08:46.859Z https://docs.axolotl.ai/docs/rlhf.html - 2025-04-29T20:19:00.606Z + 2025-04-29T21:08:46.846Z https://docs.axolotl.ai/docs/unsloth.html - 2025-04-29T20:19:00.606Z + 2025-04-29T21:08:46.846Z https://docs.axolotl.ai/docs/dataset_preprocessing.html - 2025-04-29T20:19:00.601Z + 2025-04-29T21:08:46.842Z https://docs.axolotl.ai/docs/input_output.html - 2025-04-29T20:19:00.605Z + 2025-04-29T21:08:46.845Z https://docs.axolotl.ai/docs/dataset_loading.html - 2025-04-29T20:19:00.601Z + 2025-04-29T21:08:46.842Z https://docs.axolotl.ai/docs/api/utils.collators.mamba.html - 2025-04-29T20:19:48.200Z + 2025-04-29T21:09:29.712Z https://docs.axolotl.ai/docs/api/utils.optimizers.adopt.html - 2025-04-29T20:19:47.977Z + 2025-04-29T21:09:29.484Z https://docs.axolotl.ai/docs/api/prompt_strategies.user_defined.html - 
2025-04-29T20:19:47.488Z + 2025-04-29T21:09:28.988Z https://docs.axolotl.ai/docs/api/utils.chat_templates.html - 2025-04-29T20:19:47.879Z + 2025-04-29T21:09:29.385Z https://docs.axolotl.ai/docs/api/cli.merge_lora.html - 2025-04-29T20:19:47.317Z + 2025-04-29T21:09:28.814Z https://docs.axolotl.ai/docs/api/monkeypatch.multipack.html - 2025-04-29T20:19:47.761Z + 2025-04-29T21:09:29.266Z https://docs.axolotl.ai/docs/api/core.chat.format.shared.html - 2025-04-29T20:19:47.194Z + 2025-04-29T21:09:28.686Z https://docs.axolotl.ai/docs/api/utils.schemas.integrations.html - 2025-04-29T20:19:48.049Z + 2025-04-29T21:09:29.559Z https://docs.axolotl.ai/docs/api/utils.freeze.html - 2025-04-29T20:19:47.904Z + 2025-04-29T21:09:29.410Z https://docs.axolotl.ai/docs/api/prompt_strategies.alpaca_w_system.html - 2025-04-29T20:19:47.480Z + 2025-04-29T21:09:28.979Z https://docs.axolotl.ai/docs/api/monkeypatch.attention.mllama.html - 2025-04-29T20:19:47.830Z + 2025-04-29T21:09:29.335Z https://docs.axolotl.ai/docs/api/utils.schemas.model.html - 2025-04-29T20:19:47.998Z + 2025-04-29T21:09:29.506Z https://docs.axolotl.ai/docs/api/core.datasets.transforms.chat_builder.html - 2025-04-29T20:19:47.207Z + 2025-04-29T21:09:28.700Z https://docs.axolotl.ai/docs/api/monkeypatch.btlm_attn_hijack_flash.html - 2025-04-29T20:19:47.805Z + 2025-04-29T21:09:29.310Z https://docs.axolotl.ai/docs/api/models.mamba.modeling_mamba.html - 2025-04-29T20:19:48.176Z + 2025-04-29T21:09:29.687Z https://docs.axolotl.ai/docs/api/core.datasets.chat.html - 2025-04-29T20:19:47.199Z + 2025-04-29T21:09:28.692Z https://docs.axolotl.ai/docs/api/utils.model_shard_quant.html - 2025-04-29T20:19:47.893Z + 2025-04-29T21:09:29.399Z https://docs.axolotl.ai/docs/api/monkeypatch.llama_patch_multipack.html - 2025-04-29T20:19:47.806Z + 2025-04-29T21:09:29.311Z https://docs.axolotl.ai/docs/api/utils.lora_embeddings.html - 2025-04-29T20:19:47.887Z + 2025-04-29T21:09:29.393Z https://docs.axolotl.ai/docs/api/utils.schemas.peft.html - 
2025-04-29T20:19:48.029Z + 2025-04-29T21:09:29.538Z https://docs.axolotl.ai/docs/api/core.chat.format.llama3x.html - 2025-04-29T20:19:47.193Z + 2025-04-29T21:09:28.685Z https://docs.axolotl.ai/docs/api/monkeypatch.mistral_attn_hijack_flash.html - 2025-04-29T20:19:47.759Z + 2025-04-29T21:09:29.264Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.chat_template.html - 2025-04-29T20:19:47.541Z + 2025-04-29T21:09:29.041Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.user_defined.html - 2025-04-29T20:19:47.564Z + 2025-04-29T21:09:29.064Z https://docs.axolotl.ai/docs/api/utils.distributed.html - 2025-04-29T20:19:47.966Z + 2025-04-29T21:09:29.473Z https://docs.axolotl.ai/docs/api/prompt_strategies.pygmalion.html - 2025-04-29T20:19:47.535Z + 2025-04-29T21:09:29.035Z https://docs.axolotl.ai/docs/api/utils.dict.html - 2025-04-29T20:19:47.969Z + 2025-04-29T21:09:29.477Z https://docs.axolotl.ai/docs/api/monkeypatch.llama_attn_hijack_flash.html - 2025-04-29T20:19:47.744Z + 2025-04-29T21:09:29.248Z https://docs.axolotl.ai/docs/api/prompt_strategies.base.html - 2025-04-29T20:19:47.435Z + 2025-04-29T21:09:28.934Z https://docs.axolotl.ai/docs/api/monkeypatch.data.batch_dataset_fetcher.html - 2025-04-29T20:19:47.831Z + 2025-04-29T21:09:29.337Z https://docs.axolotl.ai/docs/api/kernels.swiglu.html - 2025-04-29T20:19:47.710Z + 2025-04-29T21:09:29.212Z https://docs.axolotl.ai/docs/api/core.chat.messages.html - 2025-04-29T20:19:47.190Z + 2025-04-29T21:09:28.682Z https://docs.axolotl.ai/docs/api/index.html - 2025-04-29T20:19:46.925Z + 2025-04-29T21:09:28.414Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.zephyr.html - 2025-04-29T20:19:47.563Z + 2025-04-29T21:09:29.063Z https://docs.axolotl.ai/docs/api/convert.html - 2025-04-29T20:19:47.017Z + 2025-04-29T21:09:28.507Z https://docs.axolotl.ai/docs/api/utils.schemas.multimodal.html - 2025-04-29T20:19:48.037Z + 2025-04-29T21:09:29.546Z https://docs.axolotl.ai/docs/api/core.trainers.base.html - 2025-04-29T20:19:47.403Z 
+ 2025-04-29T21:09:28.901Z https://docs.axolotl.ai/docs/api/evaluate.html - 2025-04-29T20:19:46.996Z + 2025-04-29T21:09:28.486Z https://docs.axolotl.ai/docs/api/monkeypatch.llama_attn_hijack_xformers.html - 2025-04-29T20:19:47.745Z + 2025-04-29T21:09:29.250Z https://docs.axolotl.ai/docs/api/kernels.quantize.html - 2025-04-29T20:19:47.717Z + 2025-04-29T21:09:29.220Z https://docs.axolotl.ai/docs/api/utils.callbacks.mlflow_.html - 2025-04-29T20:19:48.228Z + 2025-04-29T21:09:29.748Z https://docs.axolotl.ai/docs/api/utils.callbacks.profiler.html - 2025-04-29T20:19:48.223Z + 2025-04-29T21:09:29.739Z https://docs.axolotl.ai/docs/api/core.trainers.dpo.trainer.html - 2025-04-29T20:19:47.430Z + 2025-04-29T21:09:28.929Z https://docs.axolotl.ai/docs/api/cli.vllm_serve.html - 2025-04-29T20:19:47.380Z + 2025-04-29T21:09:28.877Z https://docs.axolotl.ai/docs/api/train.html - 2025-04-29T20:19:46.985Z + 2025-04-29T21:09:28.476Z https://docs.axolotl.ai/docs/api/prompt_strategies.dpo.chatml.html - 2025-04-29T20:19:47.561Z + 2025-04-29T21:09:29.061Z https://docs.axolotl.ai/docs/api/utils.schemas.trl.html - 2025-04-29T20:19:48.032Z + 2025-04-29T21:09:29.541Z https://docs.axolotl.ai/docs/api/kernels.geglu.html - 2025-04-29T20:19:47.700Z + 2025-04-29T21:09:29.202Z https://docs.axolotl.ai/docs/api/utils.bench.html - 2025-04-29T20:19:47.896Z + 2025-04-29T21:09:29.402Z https://docs.axolotl.ai/docs/api/monkeypatch.transformers_fa_utils.html - 2025-04-29T20:19:47.822Z + 2025-04-29T21:09:29.327Z https://docs.axolotl.ai/docs/api/integrations.liger.args.html - 2025-04-29T20:19:48.149Z + 2025-04-29T21:09:29.660Z https://docs.axolotl.ai/docs/api/core.trainer_builder.html - 2025-04-29T20:19:47.079Z + 2025-04-29T21:09:28.570Z https://docs.axolotl.ai/docs/api/utils.schemas.utils.html - 2025-04-29T20:19:48.062Z + 2025-04-29T21:09:29.571Z https://docs.axolotl.ai/docs/api/kernels.lora.html - 2025-04-29T20:19:47.689Z + 2025-04-29T21:09:29.191Z 
https://docs.axolotl.ai/docs/api/prompt_strategies.bradley_terry.llama3.html - 2025-04-29T20:19:47.607Z + 2025-04-29T21:09:29.108Z https://docs.axolotl.ai/docs/api/core.trainers.grpo.trainer.html - 2025-04-29T20:19:47.433Z + 2025-04-29T21:09:28.932Z https://docs.axolotl.ai/docs/api/prompt_strategies.messages.chat.html - 2025-04-29T20:19:47.539Z + 2025-04-29T21:09:29.039Z https://docs.axolotl.ai/docs/api/utils.tokenization.html - 2025-04-29T20:19:47.869Z + 2025-04-29T21:09:29.375Z https://docs.axolotl.ai/docs/api/prompt_strategies.chat_template.html - 2025-04-29T20:19:47.453Z + 2025-04-29T21:09:28.952Z https://docs.axolotl.ai/docs/api/prompt_strategies.stepwise_supervised.html - 2025-04-29T20:19:47.517Z + 2025-04-29T21:09:29.017Z https://docs.axolotl.ai/docs/api/utils.samplers.multipack.html - 2025-04-29T20:19:48.213Z + 2025-04-29T21:09:29.725Z https://docs.axolotl.ai/docs/api/cli.args.html - 2025-04-29T20:19:47.271Z + 2025-04-29T21:09:28.766Z https://docs.axolotl.ai/docs/api/utils.callbacks.perplexity.html - 2025-04-29T20:19:48.219Z + 2025-04-29T21:09:29.731Z https://docs.axolotl.ai/docs/api/utils.gradient_checkpointing.unsloth.html - 2025-04-29T20:19:47.983Z + 2025-04-29T21:09:29.491Z https://docs.axolotl.ai/docs/mac.html - 2025-04-29T20:19:00.605Z + 2025-04-29T21:08:46.845Z https://docs.axolotl.ai/docs/config.html - 2025-04-29T20:19:00.600Z + 2025-04-29T21:08:46.842Z https://docs.axolotl.ai/docs/multimodal.html - 2025-04-29T20:19:00.605Z + 2025-04-29T21:08:46.845Z https://docs.axolotl.ai/docs/lr_groups.html - 2025-04-29T20:19:00.605Z + 2025-04-29T21:08:46.845Z https://docs.axolotl.ai/docs/dataset-formats/index.html - 2025-04-29T20:19:00.601Z + 2025-04-29T21:08:46.842Z https://docs.axolotl.ai/docs/dataset-formats/stepwise_supervised.html - 2025-04-29T20:19:00.601Z + 2025-04-29T21:08:46.842Z https://docs.axolotl.ai/docs/dataset-formats/pretraining.html - 2025-04-29T20:19:00.601Z + 2025-04-29T21:08:46.842Z https://docs.axolotl.ai/docs/multi-node.html - 
[Generated sitemap diff omitted: the `<lastmod>` timestamps for every docs.axolotl.ai page (docs/*, docs/api/*, docs/dataset-formats/*, FAQS.html, TODO.html, and the integrations license/acknowledgement pages) were bumped from the 2025-04-29T20:19Z build to the 2025-04-29T21:08–21:09Z rebuild; no URLs were added or removed.]