Can you increase the context length?

#3
by starmanj - opened

My discussions crash fairly quickly with an out-of-context-memory error. The model appears to be limited to a 4096-token context.

The context length can be increased up to 16k, and possibly further.
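For reference, if you are running the model with a llama.cpp-based runtime, the context window is set at load time with the `-c`/`--ctx-size` flag. A minimal sketch (the model filename is a placeholder; adjust to your setup):

```shell
# Hypothetical invocation: raise the context window from the 4096 default to 16k.
# "model.gguf" is a placeholder path; memory use grows with the context size.
./llama-cli -m model.gguf -c 16384
```

Note that raising the context beyond what the model was trained for can degrade output quality unless the runtime applies RoPE scaling or a similar extension technique.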
