Adding to transformers officially?
#9 opened 4 days ago by pszemraj
Is flash-attention-2 supported?
#8 opened 3 months ago by Jack7777777
Xformer support for Qwen1.5B
3 comments · #6 opened 3 months ago by le723z
Is the backbone model not going to be open-sourced?
10 comments · #4 opened 6 months ago by JaheimLee
Disable trust_remote_code
14 comments · #2 opened 7 months ago by veeravignesh