Does this allow for 32k context?

#1
by AS1200 - opened

When I load this model in text-generation-webui, the default context length is set to 32k. Is this correct, or is it an error?

If so, should I set compress_pos_emb to 8? I've read that with compress_pos_emb set to 8, the model can use the extended context. And will this still work if I use a custom context length of 4,096? As far as I understand, the larger the context a model supports, the more of the conversation between the user and the bot it can keep track of, including information about the character being used. I would like to keep my default context of 4,096 but be sure the model stays attentive to everything within that window.
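For what it's worth, the usual rule of thumb for linear RoPE scaling (which compress_pos_emb controls) is target context divided by the model's native context. A minimal sketch, assuming a native training context of 4,096 (typical for Llama-2-based models; check your model card):

```python
# Assumption: the model was trained with a 4,096-token context window.
native_ctx = 4096
target_ctx = 32768  # the 32k shown by the loader

# Linear RoPE scaling factor = target / native
compress_pos_emb = target_ctx // native_ctx
print(compress_pos_emb)  # → 8
```

So 8 would match a 32k target; if you only ever run at 4,096, a factor of 1 (no compression) should be fine.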
