"TypeError: Phi3 is not supported yet." while doing awq quantization using autoAWQ

#81
opened by rabinghimire737

I get the error "TypeError: Phi3 is not supported yet." while doing AWQ quantization using AutoAWQ. The quant config is given as:
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version":"GEMM"}

from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

# Load the model and tokenizer
model = AutoAWQForCausalLM.from_pretrained(model_path, token=access_token, safetensors=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True, token=access_token)

# Quantize
model.quantize(tokenizer, quant_config=quant_config)

# Save the quantized model weights and tokenizer
model.save_quantized(quant_name)
tokenizer.save_pretrained(quant_name)
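
For context, that TypeError typically means the installed AutoAWQ release does not recognize the model_type of the Phi-3 checkpoint, so it has no quantizer class to dispatch to. Below is a minimal sketch (assuming the package was installed from PyPI under the name autoawq) for checking which release is installed; if it is an older one, upgrading may pick up Phi-3 support, but please verify against the AutoAWQ release notes:

import importlib.metadata

# Print the installed AutoAWQ release; releases that predate Phi-3 support
# raise "TypeError: ... is not supported yet." because they have no
# quantizer registered for the phi3 model type.
print("autoawq version:", importlib.metadata.version("autoawq"))

# If the release is old, upgrading and retrying the quantization may help:
#   pip install -U autoawq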

Can anyone offer suggestions on this?

Same issue here
