"TypeError: phi3 isn't supported yet ". Could not quantize the phi3 model using the AWQ quantization method.

#20
by rabinghimire737 - opened

I was quantizing the Phi3-mini-128k-instruct model using the AWQ quantization technique, but I get the error "TypeError: phi3 isn't supported yet." The quantization config is given below:
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version":"GEMM"}

I encountered this problem too, so I checked the code in the AWQ repo, and it does appear to support phi3 already. I don't know what's going on.

Hi, I found that Phi-3 small's config has model_type "phi3small", unlike mini/medium, which use "phi3", and AWQ only supports "phi3" (no idea why the small one is completely different from the other two).
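Since AWQ dispatches on the `model_type` field of the model's `config.json`, one unverified workaround is to rewrite that field to "phi3" before loading. This is a hypothetical sketch only: the small variant uses a different architecture than mini/medium, so quantization may still fail downstream even after the patch.

```python
import json

def patch_model_type(config_path, new_type="phi3"):
    """Rewrite the model_type field in a local config.json.

    UNVERIFIED workaround: tricks AWQ's model-type dispatch into
    accepting a "phi3small" checkpoint; architectural differences in
    Phi-3 small may still break quantization later.
    """
    with open(config_path) as f:
        cfg = json.load(f)
    old_type = cfg.get("model_type")
    cfg["model_type"] = new_type
    with open(config_path, "w") as f:
        json.dump(cfg, f, indent=2)
    return old_type
```

Point it at the locally downloaded checkpoint's `config.json` before calling AutoAWQ; keep a backup of the original file in case the patched config misbehaves.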

Same here
