Context Size

#1
by rengecky

What is the context size of this model? According to the Mistral documentation, the model was trained with a context of 8,192 tokens, while Llama 2 has 4,096 tokens.
Thanks for the reply and for your work in general!

It should be the same as the original Mistral model.
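
If you want to verify this yourself, a minimal sketch is to load the model's config and inspect its position settings; the repo id below is a placeholder standing in for this model's actual Hub path:

```python
from transformers import AutoConfig

# Placeholder repo id; substitute this model's actual Hub path.
config = AutoConfig.from_pretrained("mistralai/Mistral-7B-v0.1")

# Maximum number of positions the model was configured for.
print(config.max_position_embeddings)

# Mistral uses sliding-window attention; this is the window size, if set.
print(getattr(config, "sliding_window", None))
```

Note that `max_position_embeddings` reflects the configured maximum, which can differ from the context length the model was actually trained on.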
