torch==2.4.0
transformers
peft==0.11.1
timm
git+https://github.com/Dao-AILab/flash-attention.git