How to increase the max length of the output?

#7 opened by AeroDEmi

I want to fine-tune this model, but some of my targets have more than 1024 tokens.

I thought that setting processor.tokenizer.model_max_length = 2048 and model.config.max_position_embeddings = 2048 would do it, but I'm running into a dimension error:

The size of tensor a (2048) must match the size of tensor b (1024) at non-singleton dimension 1
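Changing the config values alone does not resize the learned positional-embedding table the decoder was pretrained with, which is where the 2048-vs-1024 mismatch comes from: position ids now go up to 2048 while the embedding weight still only has rows for 1024 positions. Below is a minimal sketch of growing the decoder's position embeddings by hand, assuming a VisionEncoderDecoderModel whose decoder uses mBART/TrOCR-style learned position embeddings; the checkpoint name and the embed_positions attribute path are assumptions here, so adjust them to the actual model you are fine-tuning.

```python
import torch
import torch.nn as nn
from transformers import VisionEncoderDecoderModel

# Assumed checkpoint name for illustration; use the model you are fine-tuning.
model = VisionEncoderDecoderModel.from_pretrained("naver-clova-ix/donut-base")

new_max = 2048

# Assumed path: mBART/TrOCR-style decoders keep a learned positional embedding
# here; run print(model.decoder) to locate it in your architecture.
pos_emb = model.decoder.model.decoder.embed_positions
old_rows, dim = pos_emb.weight.shape

# BART-family learned position embeddings reserve a couple of extra offset rows,
# so infer the offset from the shapes instead of hard-coding it.
offset = old_rows - model.decoder.config.max_position_embeddings

with torch.no_grad():
    new_weight = torch.empty(new_max + offset, dim, dtype=pos_emb.weight.dtype)
    nn.init.normal_(new_weight, std=0.02)    # fresh rows for the new positions
    new_weight[:old_rows] = pos_emb.weight   # keep the pretrained rows
    pos_emb.weight = nn.Parameter(new_weight)
    pos_emb.num_embeddings = new_max + offset

# Keep the configs consistent with the resized table.
model.decoder.config.max_position_embeddings = new_max
model.config.decoder.max_position_embeddings = new_max
```

The newly added rows are untrained, so the model still needs fine-tuning on long targets before positions beyond 1024 are useful; also set processor.tokenizer.model_max_length = 2048 so the tokenizer stops truncating the labels.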
