tocmo0nlord/axolotl
Files at commit e85cbb86455b2c9f8d47d1b9fb69f7e4e4068082
Path: axolotl/tests/e2e/patched/lora_kernels

Latest commit: Wing Lian (198d775d6d): make sure all of the model is on the same device, so this test will pass on multigpu (#2524) [skip ci] (2025-04-15 22:15:42 -07:00)
__init__.py: Activation function Triton kernels, LoRA custom autograd functions (#2324) (2025-02-17 14:23:15 -05:00)
test_lora_kernel_patching.py: make sure all of the model is on the same device, so this test will pass on multigpu (#2524) [skip ci] (2025-04-15 22:15:42 -07:00)