llama2-13b-megacode2-oasst

Prompt template

The ChatML format is used: "<|im_start|>user\n{user prompt}<|im_end|>\n<|im_start|>assistant\n{Assistant answer}<|im_end|>\n"

The same template expanded across multiple lines:

<|im_start|>user
{user prompt}<|im_end|>
<|im_start|>assistant
{Assistant answer}<|im_end|>
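
For reference, below is a minimal sketch of applying this template when generating with the transformers library. The sampling settings, dtype, device placement, and the example user message are illustrative assumptions, not values specified by this card.

```python
# Minimal sketch: build a ChatML prompt for this model and generate a reply.
# Assumptions: half precision on GPU, sampling parameters chosen for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "OpenAssistant/llama2-13b-megacode2-oasst"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # assumption: fp16 to fit a 13B model on a single GPU
    device_map="auto",
)

def build_prompt(user_message: str) -> str:
    # Follow the template from the card; the assistant turn is left open
    # so the model continues after "<|im_start|>assistant\n".
    return (
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_prompt("Write a Python function that reverses a string.")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output_ids = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens and stop at the <|im_end|> marker.
generated = output_ids[0, inputs["input_ids"].shape[1]:]
reply = tokenizer.decode(generated, skip_special_tokens=False)
print(reply.split("<|im_end|>")[0].strip())
```

If the tokenizer registers <|im_start|> and <|im_end|> as special tokens, decoding with skip_special_tokens=True would also work; the explicit split above simply makes the stopping point visible.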

Credits & Special Thanks
