Context length
#4 opened by ryanse
Hi,

Is there a reason the tokenizer has a `model_max_length` of 16384?
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('deepseek-ai/DeepSeek-Coder-V2-Instruct-0724')
tokenizer.model_max_length
# 16384
Thanks!
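For reference, `model_max_length` is just metadata read from the repo's `tokenizer_config.json`; it is not a hard limit enforced by the model weights, and it can be overridden with a keyword argument when the tokenizer is loaded. A minimal offline sketch (using a tiny in-memory tokenizer instead of the actual checkpoint, which would need a large download; the same kwarg works with `AutoTokenizer.from_pretrained`):

```python
from tokenizers import Tokenizer, models
from transformers import PreTrainedTokenizerFast

# Build a tiny in-memory tokenizer so the sketch runs without downloading
# the real checkpoint; with the real repo you would pass the same
# model_max_length kwarg to AutoTokenizer.from_pretrained(...).
tok = Tokenizer(models.WordLevel({"[UNK]": 0, "hello": 1}, unk_token="[UNK]"))

# model_max_length is an ordinary init kwarg, so it can be overridden here
# (131072 = 128 * 1024, assuming the model itself supports that context).
tokenizer = PreTrainedTokenizerFast(tokenizer_object=tok, model_max_length=131072)
print(tokenizer.model_max_length)  # 131072
```

Whether the model produces sensible output beyond the configured length is a separate question from what the tokenizer config advertises.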