peft @ git+https://github.com/huggingface/peft.git
transformers @ git+https://github.com/huggingface/transformers.git
bitsandbytes>=0.41.1
accelerate @ git+https://github.com/huggingface/accelerate@2a289f6108e77a77a4efffb3f6316bc98538413b
addict
fire
PyYAML==6.0
datasets
flash-attn==2.0.8
sentencepiece
wandb
einops
xformers
optimum
hf_transfer
colorama
numba
numpy==1.24.4
# qlora things
bert-score==0.3.13
evaluate==0.4.0
rouge-score==0.1.2
scipy
scikit-learn==1.2.2
pynvml