This model was converted to OpenVINO from Qwen/Qwen2.5-0.5B using optimum-intel, via the export space.
First, make sure optimum-intel is installed:
pip install "optimum[openvino]"
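For reference, a conversion like this one can also be reproduced locally with optimum-intel's Python API. The sketch below is illustrative only; the output directory name is a placeholder, and this particular checkpoint was produced through the export space rather than this snippet:

from optimum.intel import OVModelForCausalLM

# Load the original PyTorch checkpoint and export it to OpenVINO IR on the fly
ov_model = OVModelForCausalLM.from_pretrained("Qwen/Qwen2.5-0.5B", export=True)

# Save the converted model to a local directory (name chosen here for illustration)
ov_model.save_pretrained("Qwen2.5-0.5B-openvino")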
You can then load the converted model as follows:
from optimum.intel import OVModelForCausalLM
model_id = "piimaila/Qwen2.5-0.5B-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
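From there, inference follows the usual transformers workflow. A minimal generation sketch is shown below; the prompt text and generation settings are placeholders, not recommendations from the model authors:

from transformers import AutoTokenizer
from optimum.intel import OVModelForCausalLM

model_id = "piimaila/Qwen2.5-0.5B-openvino"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

# Tokenize an example prompt and generate a short continuation
inputs = tokenizer("OpenVINO is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))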