mpt_7b_chat-dense_quant_linearW8A8MatMul8Embeds8LMhead8

import deepsparse
from huggingface_hub import snapshot_download

# Download the quantized model files from the Hugging Face Hub
MODEL_PATH = snapshot_download(repo_id="mgoin/mpt-7b-chat-quant")

# Create a DeepSparse text-generation pipeline from the local model files
model = deepsparse.Pipeline.create(task="text-generation", model_path=MODEL_PATH)

# Run inference on a prompt and print the result
output = model(sequences="Tell me a joke.")
print(output)
Inference Providers
This model is not currently available via any of the supported Inference Providers. It also cannot be deployed to the HF Inference API, because that API does not support models that require custom code execution.