mistralai_Mistral-Nemo-Instruct-2407-exl2-4bpw

This is a 4.0bpw quantized version of mistralai/Mistral-Nemo-Instruct-2407, produced with exllamav2.

License

This model is available under the Apache 2.0 License.

Discord Server

Join our Discord server here.

Feeling Generous? 😊

Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note on which one you want me to drink.
