Overview

Mistral AI (mistralai) developed and released the Mistral-Nemo family of large language models (LLMs).

Variants

No   Variant        Cortex CLI command
2    gguf           cortex run mistral-nemo:gguf
3    main/default   cortex run mistral-nemo

Use it with Jan (UI)

  1. Install Jan by following the Quickstart guide
  2. In the Jan Model Hub, search for and download:
    cortexso/mistral-nemo
    

Use it with Cortex (CLI)

  1. Install Cortex by following the Quickstart guide
  2. Run the model with the command:
    cortex run mistral-nemo
    
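Once the model is running, recent Cortex releases also expose a local OpenAI-compatible HTTP API that other tools can call. The port (39281) and endpoint path below are assumptions about a default Cortex setup, not confirmed by this page; check your own installation. A minimal sketch of building a chat request for that server:

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for a local Cortex server.

    The payload shape follows the OpenAI chat-completions convention;
    whether your Cortex version accepts it is an assumption to verify.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("mistral-nemo", "Summarize what Mistral-Nemo is.")

# Assumed endpoint: POST this JSON to http://localhost:39281/v1/chat/completions
print(json.dumps(payload, indent=2))
```

If the endpoint differs on your machine, only the URL changes; the payload shape stays the same.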

Credits

  - Author: mistralai
