Is there a way to support > 512 token length?

#5
by kk3dmax - opened


Will modifying max_position_embeddings from 514 to 2048 help?

Beijing Academy of Artificial Intelligence org

The model doesn't support more than 512 tokens. Since the maximum position embedding size was 512 during training, simply increasing max_position_embeddings will not perform well.
The next version of the reranker will support longer text.
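In the meantime, a common workaround (not a replacement for native long-context support) is to split a long passage into overlapping chunks, score each chunk against the query, and keep the maximum score. A minimal sketch, assuming token-id lists and a hypothetical `score_fn` standing in for the reranker's forward pass:

```python
def chunk_ids(doc_ids, chunk_size, stride):
    """Split a token-id list into overlapping windows."""
    chunks = []
    start = 0
    while start < len(doc_ids):
        chunks.append(doc_ids[start:start + chunk_size])
        if start + chunk_size >= len(doc_ids):
            break
        start += stride
    return chunks

def rerank_long(query_ids, doc_ids, score_fn, max_len=512, num_special=4):
    """Score a long document by max-pooling over chunk scores.

    score_fn(query_ids, chunk) is a placeholder for the actual model
    call; num_special reserves room for special tokens in the pair.
    """
    budget = max_len - num_special - len(query_ids)
    if budget <= 0:
        raise ValueError("query alone exceeds the model's max length")
    chunks = chunk_ids(doc_ids, chunk_size=budget, stride=max(1, budget // 2))
    return max(score_fn(query_ids, chunk) for chunk in chunks)
```

Max-pooling over chunks keeps the best-matching span's score, which is usually what a reranker is asked for, but it cannot model cross-chunk context.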

Do we have an ETA for the new version? Say, 1 month or 3 months from now...

Beijing Academy of Artificial Intelligence org

We plan to release the next version of BGE within 2 months.

Thank you for your reply!

Hi,

I have a similar interest to @kk3dmax .

Hope to see the new version soon!

Hi there, I am wondering where we could find the currently supported max token length?
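One place to look is the model repo's config.json, whose max_position_embeddings field records the trained position limit; the tokenizer's model_max_length attribute usually reflects the usable length as well. A minimal sketch with illustrative values (hardcoded here rather than fetched from the Hub; the 514 vs. 512 gap comes from XLM-RoBERTa-style models reserving two position slots):

```python
import json

# Example config values mirroring an XLM-RoBERTa-style reranker's
# config.json -- illustrative, not downloaded from the Hub.
config = json.loads('{"max_position_embeddings": 514, "pad_token_id": 1}')

# Two position slots are reserved, so the usable input length is 514 - 2.
usable_length = config["max_position_embeddings"] - 2
print(usable_length)
```

With transformers installed, `AutoConfig.from_pretrained(model_id).max_position_embeddings` reads the same field directly from the Hub.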
