Feat: add ministral3 (#3297)

* feat: add ministral and mistral3

* chore: lint

* feat: update cce for ministral

* fix: add vram usage

* feat: update for release

* fix: save_pretrained issue in v5

* fix: add instructions to use v5 branch

* fix: add to multipack

* fix: improve instructions

* fix: add model to readme
Author: NanoCode012
Date: 2025-12-04 20:32:08 +07:00
Committer: GitHub
Parent: 86d8cca149
Commit: 2b66ee189c
13 changed files with 314 additions and 20 deletions


@@ -19,7 +19,7 @@ python scripts/cutcrossentropy_install.py | sh
 - If you are installing from pip
 ```bash
-pip3 uninstall -y cut-cross-entropy && pip3 install "cut-cross-entropy[transformers] @ git+https://github.com/axolotl-ai-cloud/ml-cross-entropy.git@5eff953"
+pip3 uninstall -y cut-cross-entropy && pip3 install "cut-cross-entropy[transformers] @ git+https://github.com/axolotl-ai-cloud/ml-cross-entropy.git@f643b88"
 ```
 ## Usage
@@ -61,6 +61,8 @@ plugins:
 - llama4
 - llama4_text
 - llava
+- ministral
+- ministral3
 - mistral
 - mistral3
 - mixtral
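
With ministral and ministral3 registered above, the plugin can be enabled from a training config in the usual way. A minimal sketch, assuming the standard plugin path and `cut_cross_entropy` flag from the Axolotl docs; the `base_model` value is a hypothetical placeholder:

```yaml
# Minimal sketch: enable Cut Cross Entropy for a Ministral run.
# base_model is a hypothetical placeholder; substitute your actual checkpoint.
base_model: mistralai/Ministral-8B-Instruct-2410

plugins:
  - axolotl.integrations.cut_cross_entropy.CutCrossEntropyPlugin

cut_cross_entropy: true
```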


@@ -35,7 +35,7 @@ LOG = get_logger(__name__)
 _CCE_INSTALL_MESSAGE = (
     "Please install Axolotl's fork of cut_cross_entropy with transformers support using "
-    '`pip install "cut-cross-entropy[transformers] @ git+https://github.com/axolotl-ai-cloud/ml-cross-entropy.git@5eff953"`'
+    '`pip install "cut-cross-entropy[transformers] @ git+https://github.com/axolotl-ai-cloud/ml-cross-entropy.git@f643b88"`'
 )


@@ -52,6 +52,8 @@ SUPPORTED_MULTIPACK_MODEL_TYPES = [
     "olmo",
     "olmo2",
     "olmo3",
+    "ministral",
+    "ministral3",
     "afmoe",
 ]
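
Since ministral and ministral3 now appear in SUPPORTED_MULTIPACK_MODEL_TYPES, sample packing should be usable for these model types. A minimal config sketch; the option names are standard Axolotl settings, the values are illustrative:

```yaml
# Illustrative fragment: turn on sample packing for a multipack-capable model.
sample_packing: true
eval_sample_packing: false  # packing during eval is optional; shown for completeness
sequence_len: 4096          # illustrative packed sequence length
```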


@@ -218,3 +218,10 @@ class HFMistralTokenizer(MistralCommonTokenizer):
             model_input_names=model_input_names,
             clean_up_tokenization_spaces=clean_up_tokenization_spaces,
         )
+
+    def save_pretrained(self, *args, **kwargs) -> tuple[str, ...]:
+        """
+        Patches to remove save_jinja_files from being passed onwards.
+        """
+        kwargs.pop("save_jinja_files", None)
+        return super().save_pretrained(*args, **kwargs)