<p>
    This open-source model was created by <a target="_blank" href="https://mistral.ai/">Mistral AI</a>.
    You can find the release blog post <a target="_blank" href="https://mistral.ai/news/mixtral-of-experts/">here</a>.
    The model is available on the Hugging Face Hub: <a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1">https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1</a>.
    The model has 46.7B total and 12.9B active parameters, and supports a context length of up to 32K tokens.
</p>
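<p>
    Below is a minimal sketch (not part of the original release materials) of loading the model from the Hub with the
    Hugging Face <code>transformers</code> library. It assumes <code>transformers</code> and <code>torch</code> are installed
    and that enough GPU memory (or a quantized variant) is available; the prompt is purely illustrative.
</p>
<pre><code>
# Minimal sketch: load Mixtral-8x7B-Instruct from the Hugging Face Hub and generate a reply.
# Assumes transformers + torch are installed and sufficient GPU memory is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory use
    device_map="auto",           # spread weights across available GPUs
)

# Build a chat-formatted prompt and generate a short answer.
messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</code></pre>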