Update max_position_embeddings to 128k context size instead of 32k
#4 opened by qwopqwop
Update max_position_embeddings to 128k context size instead of 32k
(https://huggingface.co/mistralai/Mistral-Large-Instruct-2407/commit/5c9ce5b5f7a7ad62d03e8c66c719b66d586de26b)
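The change itself amounts to updating a single field in the model's config.json. A minimal sketch of the before/after values, assuming the usual power-of-two token counts ("32k" = 32768 tokens, "128k" = 131072 tokens):

```python
import json

# Before this PR: the config advertised a 32k context window.
old_config = {"max_position_embeddings": 32768}

# After this PR: the field is raised to the model's full 128k context.
new_config = {**old_config, "max_position_embeddings": 131072}

print(json.dumps(new_config))
```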
qwopqwop changed pull request title from "Update max_position_embeddings to 128k context size instead of 32k(https://huggingface.co/mistralai/Mistral-Large-Instruct-2407/commit/5c9ce5b5f7a7ad62d03e8c66c719b66d586de26b)" to "Update max_position_embeddings to 128k context size instead of 32k"
TechxGenus changed pull request status to merged