
Expanding the window size

#12
by agershun - opened

Is there any way to increase the size of the context window for this LLM?

There are some solutions for Mistral to expand the window size to 8K or 16K. Are they applicable to this model?

upstage org

Since we are using the Llama 2 architecture, there are numerous ways to expand the context window. 🛠️
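For instance, one commonly used option for Llama-2-based models is RoPE scaling through the `rope_scaling` config in 🤗 Transformers. A minimal sketch is below; the model ID is a placeholder, and the scaling factor and type are illustrative assumptions, not a recommendation specific to this model.

```python
# Sketch: extending the context window of a Llama-2-based model via RoPE
# scaling in 🤗 Transformers (requires transformers >= 4.31).
# NOTE: the model ID is a placeholder — substitute the actual repo name.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-llama2-model"  # hypothetical placeholder

config = AutoConfig.from_pretrained(model_id)
# Double the usable context (e.g. 4K -> 8K) with linear position interpolation.
# "dynamic" (NTK-aware) scaling is another option that needs no fine-tuning,
# though quality at long lengths generally improves with further training.
config.rope_scaling = {"type": "linear", "factor": 2.0}
config.max_position_embeddings *= 2

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, config=config)
```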

hunkim changed discussion status to closed
