Tags: Text Generation · Transformers · PyTorch · mistral · text-generation-inference · Inference Endpoints

How to run inference

#2
by MahaultA - opened

Hi there!
I'm trying to use your model.
But nowhere in the documentation or the repo can I find an explanation of how to actually use your model to answer queries. What code do I have to write to give inputs and get outputs?
Thank you

Hi! Did you find any way to run inference?

https://colab.research.google.com/drive/13YKPxye12UP_1SMzUhRNAv_bdEVb89kB#scrollTo=Wr_BNAPehPqf
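In case the notebook is hard to access, here is a minimal sketch of running inference with the transformers library. The model ID, the `[INST] ... [/INST]` prompt template, and the generation settings below are assumptions for a Mistral instruct checkpoint; adjust them to match this repo.

```python
def build_prompt(user_message: str) -> str:
    """Wrap a question in the [INST] ... [/INST] template that Mistral
    instruct checkpoints expect (the tokenizer adds the BOS token itself)."""
    return f"[INST] {user_message} [/INST]"


def generate_answer(question: str,
                    model_id: str = "mistralai/Mistral-7B-Instruct-v0.1") -> str:
    """Load the model and generate a reply.
    Note: downloads the full weights on first run, so this needs a GPU
    (or a lot of RAM) and disk space; model_id here is an assumption."""
    from transformers import AutoModelForCausalLM, AutoTokenizer  # pip install transformers

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # Tokenize the formatted prompt and move it to the model's device.
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=200)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


print(build_prompt("Why is the sky blue?"))
# → [INST] Why is the sky blue? [/INST]
```

You can then call `generate_answer("Why is the sky blue?")` to get a completion. If the repo supports it, the higher-level `transformers.pipeline("text-generation", model=model_id)` works too.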

Hi! It seems the link requires authorization to open. Could you reshare it or approve my access request?
Thanks a lot~