Mistral flash attn packing (#646)

* add mistral monkeypatch

* add arg for decoder attention mask

* fix lint for duplicate code

* make sure to update transformers too

* tweak install for e2e

* move mistral patch to conditional
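The last commit gates the patch behind a runtime check. A minimal sketch of that pattern, conditionally swapping in a flash-attention forward only when the config asks for it (all names here are hypothetical, not axolotl's actual API):

```python
# Hedged sketch of conditional monkeypatching, in the spirit of
# "move mistral patch to conditional". DummyAttention stands in for a
# transformers attention module; the real patch targets Mistral classes.

class DummyAttention:
    """Stand-in for an attention module whose forward gets replaced."""
    def forward(self, x):
        return x  # baseline (non-flash) implementation


def flash_attn_forward(self, x):
    # Hypothetical flash-attention replacement; the real one would call
    # flash_attn kernels and handle sample packing.
    return x


def maybe_patch_mistral(cfg, module_cls=DummyAttention):
    """Apply the monkeypatch only when the loaded model is Mistral-derived
    and flash attention is enabled, instead of unconditionally at import."""
    if cfg.get("is_mistral_derived_model") and cfg.get("flash_attention"):
        module_cls.forward = flash_attn_forward
        return True
    return False
```

Gating the patch this way keeps non-Mistral runs on the stock attention path, so the monkeypatch cannot break other architectures.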
Author: Wing Lian
Date: 2023-09-27 18:41:00 -04:00 (committed via GitHub)
Parent: 85b0be2ba7
Commit: b6ab8aad62
4 changed files with 412 additions and 4 deletions


@@ -44,7 +44,7 @@ jobs:
       - name: Install dependencies
         run: |
-          pip3 install -e .
+          pip3 install -U -e .
           pip3 install -r requirements-tests.txt
       - name: Run tests
@@ -69,8 +69,7 @@ jobs:
       - name: Install dependencies
         run: |
-          pip3 install -e .
-          pip3 install flash-attn
+          pip3 install -U -e .[flash-attn]
           pip3 install -r requirements-tests.txt
       - name: Run e2e tests