---
license: mit
---
This is the quantized (INT8) ONNX variant of the bge-base-en-v1.5 embedding model, created with the DeepSparse Optimum pipeline for ONNX export/inference and Neural Magic's Sparsify for one-shot quantization.