Maximum context length

#2
by Jackpot115 - opened

Hi!

The config.json file says `"max_position_embeddings": 512`. Does this mean that the maximum length of the text input is 512 tokens? I'm more familiar with decoder-only LLMs, where the context window is a lot longer, so I wanted to check my understanding.
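For reference, here is a minimal sketch of how that field can be read from a config.json. The inline config excerpt is hypothetical (only the one key from the question), but the key name itself is the standard one used in transformer model configs:

```python
import json

# Hypothetical excerpt of a model's config.json; in practice the file
# contains many more fields, but this is the key in question.
config_text = '{"max_position_embeddings": 512}'
config = json.loads(config_text)

# For encoder models with learned position embeddings (e.g. BERT-style),
# this value is the longest token sequence the position embeddings cover,
# so longer inputs are typically truncated to this length.
max_len = config["max_position_embeddings"]
print(max_len)  # 512
```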

Thank you in advance!
