Support for AWS Sagemaker deployment · #54 opened 17 days ago by samqwiet
Phi3.5 model selection · #53 opened about 2 months ago by DocDBrown
Switch import mechanism for flash_attn · #51 opened 4 months ago by nvwilliamz
ModuleNotFoundError: No module named 'transformers_modules.microsoft.Phi-3' · #49 opened 5 months ago by hsmanju
Model consistently gets into a loop to repeat itself if there is too much in the context window · 4 replies · #48 opened 5 months ago by mstachow
Resource Requirements to load and save model · #47 opened 5 months ago by nana123652
KeyError: 'factor' · #45 opened 6 months ago by surak
How much GPU is needed to load the Phi-3.5-MoE-instruct model · 2 replies · #44 opened 6 months ago by cyt78
QAT · 3 replies · #42 opened 6 months ago by rezzie-rich
ModuleNotFoundError: No module named 'triton' · #41 opened 6 months ago by Maximum2000
Cannot use transformer library to inference the · 8 replies · #40 opened 6 months ago by manishbaral
Validation loss · #39 opened 6 months ago by Mani5112
The model 'PhiMoEForCausalLM' is not supported for text-generation. Supported models are ['BartForCausalLM', 'BertLMHeadModel', ....... · 11 replies · #34 opened 7 months ago by xxbadarxx
Only CPU is used during inference. · #33 opened 7 months ago by rockcat-miao
The provided example doesn't work · 5 replies · #32 opened 7 months ago by kqsong
need gguf · 19 replies · #4 opened 7 months ago by windkkk