This model was converted to OpenVINO format from ibm-granite/granite-8b-code-instruct-4k using optimum-intel, via the export space.
First, make sure you have optimum-intel with OpenVINO support installed:

```shell
pip install optimum[openvino]
```
To load the model:

```python
from optimum.intel import OVModelForCausalLM

model_id = "NitroLLM/granite-8b-code-instruct-4k-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
```
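A minimal end-to-end inference sketch building on the load step above. It assumes the repository ships a tokenizer alongside the OpenVINO weights (standard for optimum-intel exports); the prompt and `max_new_tokens` value are illustrative, not prescribed by the model card:

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "NitroLLM/granite-8b-code-instruct-4k-openvino"

# Load the tokenizer and the OpenVINO-optimized model (runs on CPU by default).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

# Tokenize an illustrative prompt and generate a completion.
prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that `OVModelForCausalLM` is a drop-in replacement for `transformers`' `AutoModelForCausalLM`, so the usual `generate()` parameters (sampling, temperature, etc.) apply unchanged.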
This model is not currently available via any of the supported Inference Providers, and it cannot be deployed to the HF Inference API: the model authors have turned that off explicitly.
Model tree for NitroLLM/granite-8b-code-instruct-4k-openvino
- Base model: ibm-granite/granite-8b-code-base-4k
- Finetuned: ibm-granite/granite-8b-code-instruct-4k
Evaluation results (self-reported pass@1 on HumanEvalSynthesis (Python)):
- 57.9
- 52.4
- 58.5
- 43.3
- 48.2
- 37.2
- 53.0
- 42.7
- 52.4
- 36.6