This open-source model was created by Mistral AI.
The release blog post is available on Mistral AI's website.
The model weights are available on the Hugging Face Hub: https://huggingface.co/mistralai/Mistral-Large-Instruct-2407.
The 123B-parameter model supports a context window of up to 128K tokens.
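As a reference point, the snippet below is a minimal sketch of loading the checkpoint from the Hugging Face Hub with the `transformers` library (an assumption; this section does not prescribe a particular serving stack). A 123B-parameter model requires substantial GPU memory or multi-GPU sharding, so this is illustrative only.

```python
# Minimal sketch (assumption: `transformers` + `accelerate` installed and
# enough GPU memory to shard a 123B-parameter checkpoint).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Large-Instruct-2407"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native dtype
    device_map="auto",    # shard layers across available GPUs
)

# Build a chat-formatted prompt and generate a short completion.
messages = [{"role": "user", "content": "Summarize the Mistral Large 2 release in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```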