Sequence Length or Prompt Token Limit: Too Low!!
#4 by Ekolawole - opened
The 2048-token prompt limit is too low. We need models with a 30,000+ token sequence length or prompt limit; that would be a true sign of improvement. The model only accepts 2048 input tokens, which is the same as every other model out there.
Hi Emmanuel,
We have experimented internally with finetuning to longer sequence lengths, and it works quite well. We'd be happy to see some community versions of the models exploring this idea 🤗.
FalconLLM changed discussion status to closed
Are there any plans to release some of the experiments?
Is it possible to use ALiBi for longer sequences, or is this a dead end?
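For context on why ALiBi is a candidate here: instead of learned positional embeddings, it adds a per-head linear penalty on attention logits that depends only on the relative distance between tokens, which is why it can extrapolate past the training length. A minimal NumPy sketch of the bias computation (the head count and sequence length below are illustrative, not Falcon's actual configuration):

```python
import numpy as np

def alibi_slopes(num_heads: int) -> np.ndarray:
    # Geometric sequence of per-head slopes, starting at 2^(-8/num_heads),
    # as in the ALiBi paper (assumes num_heads is a power of two).
    start = 2.0 ** (-8.0 / num_heads)
    return np.array([start ** (i + 1) for i in range(num_heads)])

def alibi_bias(num_heads: int, seq_len: int) -> np.ndarray:
    # Bias added to attention logits: slope_h * (j - i) for key position j
    # attending from query position i. Because the penalty depends only on
    # relative distance, the same formula applies to sequences longer than
    # those seen during training.
    slopes = alibi_slopes(num_heads)            # (H,)
    pos = np.arange(seq_len)
    distance = pos[None, :] - pos[:, None]      # (L, L), entry [i, j] = j - i
    causal = np.tril(distance)                  # keep past positions only
    return slopes[:, None, None] * causal[None, :, :]   # (H, L, L)

bias = alibi_bias(num_heads=4, seq_len=8)
print(bias.shape)  # (4, 8, 8)
```

In practice the bias is added to the query-key scores before the softmax, alongside the usual causal mask.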
Yeah, the 2048 limit is too short. We are working with 30k tokens and need help to move ahead on that.
Is there any way I could contribute to that?