Context size

#2 opened by schnapper79

The original model supports a large context of 128k tokens, but your config.json says 32k. Can I still use 128k?

Hi! Sure can! It's how I run it. The problem was that Mistral's original release had 32k in their default config.json, and I quantized the model before they corrected it:
https://huggingface.co/mistralai/Mistral-Large-Instruct-2407/discussions/3
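
In the meantime you can override the context length yourself at load time. Here's a minimal sketch, assuming the checkpoint loads through transformers (the repo id below is a placeholder, substitute the actual one):

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Placeholder repo id; replace with the actual quantized model repo.
repo_id = "path/to/this-quantized-model"

# Load the config and bump the context window back to 128k
# (Mistral's corrected value is 131072 positions).
config = AutoConfig.from_pretrained(repo_id)
config.max_position_embeddings = 131072

# Pass the patched config so the model uses the full context window.
model = AutoModelForCausalLM.from_pretrained(repo_id, config=config)
```

Editing max_position_embeddings in the repo's config.json directly has the same effect and is what the fix in the linked discussion does.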

I should go in and edit that. Thanks for reminding me!

Ah I see. Thanks.

schnapper79 changed discussion status to closed
