max length?

#13
by PeppePasti - opened

The paper mentions 2048 tokens, while here in the model card I see a max of 512. Which one should I use for better performance?

Nomic AI org

It's 512; 2048 was for v1. However, you may be able to scale longer since we use RoPE, but YMMV.

zpn changed discussion status to closed

Could you share the script to scale to longer context?
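For anyone curious about the general idea: one common way to stretch a RoPE model past its trained context is position interpolation, where positions are divided by a scale factor so a longer sequence maps back into the rotation range seen during training. This is a minimal, hypothetical sketch of that idea in NumPy, not the official Nomic script; the dimension, base, and scale values here are illustrative assumptions.

```python
import numpy as np

def rope_angles(positions, dim=8, base=10000.0, scale=1.0):
    # Rotary frequencies: theta_i = base^(-2i/dim), one per pair of dims.
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)
    # Position interpolation (assumption, for illustration): dividing
    # positions by `scale` squeezes a longer sequence into the position
    # range the model was trained on.
    pos = np.asarray(positions, dtype=np.float64) / scale
    return np.outer(pos, inv_freq)  # shape: (len(positions), dim // 2)

# With scale=2, the angles at position 1024 equal the unscaled angles at
# position 512, so a 1024-token input stays within the trained range.
scaled = rope_angles([1024], scale=2.0)
unscaled = rope_angles([512], scale=1.0)
assert np.allclose(scaled, unscaled)
```

Whether this preserves embedding quality at longer lengths is exactly the "YMMV" part of the reply above; results tend to degrade the further you stretch.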
