Context length of the model?

#30
by shipWr3ck - opened

I did not find that on the description page, what is the max context length of the model?

It's in the config.json:
"max_position_embeddings": 8192,
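For illustration, here's how you could read that field with plain Python. The snippet below parses an excerpt of the config rather than the full file, so the exact fields shown are just the relevant subset:

```python
import json

# Excerpt of the relevant fields from the model's config.json
config_text = '{"model_type": "gemma2", "max_position_embeddings": 8192}'
config = json.loads(config_text)

# The maximum context length the model supports
print(config["max_position_embeddings"])  # 8192
```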

Google org

Hi @shipWr3ck, you can find the context window size in its config.json, which is 8192. Kindly go through the config.json file and let me know if you have any doubts. Thank you.

As with all Gemma 2 models, it's 8192.

However, they are also compatible with SelfExtend, which can extend the context to 32768 without degradation.

You need a lot of GPU memory to do this though; an A100 or better only.
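For anyone curious how SelfExtend works: the core idea is to remap relative positions at attention time, so nearby tokens keep their exact positions while distant tokens share grouped (floored) positions that stay inside the trained window. Here is a rough sketch of that remapping only; the function name and the group/neighbor values are illustrative, not the official implementation or its defaults:

```python
def self_extend_rel_pos(q_pos: int, k_pos: int, group: int, neighbor: int) -> int:
    """Remap the relative position between a query token and an earlier key token.

    Tokens within `neighbor` keep their normal relative positions; more
    distant tokens are mapped to grouped positions so they fall inside the
    range the model was trained on. Illustrative sketch of the SelfExtend
    idea, not the paper's code.
    """
    rel = q_pos - k_pos
    if rel <= neighbor:
        return rel
    # Grouped attention: floor positions by the group size, then shift so the
    # grouped range lines up with the edge of the neighbor window.
    return (q_pos // group) - (k_pos // group) + neighbor - neighbor // group

# With group=8 and neighbor=512, a token 20,000 positions back is remapped
# to a relative position well inside an 8192-position window.
print(self_extend_rel_pos(20_000, 0, group=8, neighbor=512))  # 2948
```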

shipWr3ck changed discussion status to closed

I wish Gemma's next release supported at least an 80k context length (it should fit nicely on an H100 80G). 8192 is only good for lab experiments, useless in practice.


If you're hoping for a higher context length from the Gemma series, you may want to stick around for Gemma 3. Maybe Google will make it a 128k model, as Llama 3.1 and above already are.
