# adding llama3 fastchat conversation monkeypatch
# Updated conversation turns to work with PR3259 of FastChat
# fixed bos token
# bump fastchat version
# Co-authored-by: Wing Lian <wing.lian@gmail.com>
--extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/
packaging==23.2
peft==0.10.0
transformers @ git+https://github.com/huggingface/transformers.git@43d17c18360ac9c3d3491389328e2fe55fe8f9ce
tokenizers==0.15.0
bitsandbytes==0.43.0
accelerate==0.28.0
deepspeed==0.13.1
pydantic==2.6.3
addict
fire
PyYAML>=6.0
requests
datasets==2.15.0
flash-attn==2.5.5
sentencepiece
wandb
einops
xformers==0.0.22
optimum==1.16.2
hf_transfer
colorama
numba
numpy>=1.24.4
# qlora things
evaluate==0.4.1
scipy
scikit-learn==1.2.2
pynvml
art
fschat @ git+https://github.com/lm-sys/FastChat.git@27a05b04a35510afb1d767ae7e5990cbd278f8fe
gradio==3.50.2
tensorboard

mamba-ssm==1.2.0.post1

# remote filesystems
s3fs
gcsfs
# adlfs

trl==0.8.5
zstandard==0.22.0
fastcore
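# Most core training dependencies above are pinned to exact versions
# (`==`), while a few use lower bounds (`>=`) or no specifier at all.
# A minimal stdlib-only sketch of telling these cases apart when
# auditing such a file; `parse_pin` is a hypothetical helper, not part
# of pip, and it handles only the simple specifier forms seen above:
#
# ```python
# def parse_pin(line: str):
#     """Parse a simple requirements line like 'peft==0.10.0'.
#
#     Returns (name, operator, version); operator and version are None
#     for unpinned entries like 'addict'. Comment-only or blank lines
#     return None. Direct-URL lines ('pkg @ git+...') fall through to
#     the unpinned case.
#     """
#     line = line.split("#", 1)[0].strip()  # drop inline comments
#     if not line:
#         return None
#     for op in ("==", ">=", "<=", "~="):
#         if op in line:
#             name, version = line.split(op, 1)
#             return name.strip(), op, version.strip()
#     return line, None, None
#
# print(parse_pin("peft==0.10.0"))   # ('peft', '==', '0.10.0')
# print(parse_pin("PyYAML>=6.0"))    # ('PyYAML', '>=', '6.0')
# print(parse_pin("addict"))         # ('addict', None, None)
# ```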