• changed to 4k context length (Phi-3-mini-4k-instruct)

The information about the context length in the model card is contradictory. Does this GGUF model have a context length of 4k or 128k tokens?

Ready to merge
This branch is ready to get merged automatically.
