Quantization support?

#2
by Corianas - opened

Hi, do you know whether this would support the model_q8f16.onnx file in the same directory the model is already loading from, and whether using it would improve things even further?
