tocmo0nlord/axolotl
axolotl/tests/e2e/multigpu (at commit 431888c1de9b8059f5f8d80f73556e5df09d594f)
Latest commit: Wing Lian, "use smaller pretrained models for ci", 2026-04-23 13:51:01 +00:00
| Name | Last commit | Date |
| --- | --- | --- |
| patched | upgrade transformers to 5.4.0 (#3562) | 2026-03-31 19:15:59 -04:00 |
| solo | transformers v5 upgrade (#3272) | 2026-01-27 17:08:24 -05:00 |
| __init__.py | Attempt to run multigpu in PR CI for now to ensure it works (#1815) [skip ci] | 2024-08-09 11:50:13 -04:00 |
| test_dist_muon_fsdp2.py | use smaller pretrained models for ci | 2026-04-23 13:51:01 +00:00 |
| test_eval.py | Add ruff, remove black, isort, flake8, pylint (#3092) | 2025-08-23 23:37:33 -04:00 |
| test_fp8_fsdp2.py | transformers v5 upgrade (#3272) | 2026-01-27 17:08:24 -05:00 |
| test_fsdp1.py | use smaller pretrained models for ci | 2026-04-23 13:51:01 +00:00 |
| test_fsdp2_lora_kernels.py | use smaller pretrained models for ci | 2026-04-23 13:51:01 +00:00 |
| test_fsdp2.py | use smaller pretrained models for ci | 2026-04-23 13:51:01 +00:00 |
| test_gemma3.py | transformers v5 upgrade (#3272) | 2026-01-27 17:08:24 -05:00 |
| test_llama.py | transformers v5 upgrade (#3272) | 2026-01-27 17:08:24 -05:00 |
| test_locking.py | Data loader refactor (#2707) | 2025-06-10 19:53:07 -04:00 |
| test_ray.py | When using Ray use prepare for dataloader fixes (#3198) | 2025-10-08 10:43:41 -04:00 |
| test_tp.py | use smaller pretrained models for ci | 2026-04-23 13:51:01 +00:00 |