API access - add /models endpoint.

#21
by antonhugs - opened

This is a request to add a "/models" endpoint to your API (https://console.upstage.ai/).

Why?
Many existing UI clients call /models first to discover which model name to use for chat completions.
E.g.: I'm an open-webui user. It is not possible to integrate your API into it, since your backend does not implement the /models endpoint and open-webui does not know which model name to use.
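
For illustration, here is roughly what such a client does on startup. This is a minimal sketch: the base URL and API key are placeholders, not your actual endpoint details.

```python
import requests

# Placeholder base URL and key; Upstage's real endpoint details may differ.
BASE_URL = "https://api.upstage.ai/v1"
API_KEY = "YOUR_API_KEY"

# OpenAI-compatible UI clients issue GET /models on startup to enumerate
# the model IDs they can later pass to /chat/completions.
resp = requests.get(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
resp.raise_for_status()

# Each entry's "id" becomes a selectable model name in the client's UI.
for model in resp.json()["data"]:
    print(model["id"])
```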

HF is probably not the place for such a request. Anyhow, I hope you can forward this ask to your backend peeps.

PS. Tried the model locally and it works great. The 4K context is a shame, but it is more than enough for quick questions.

Cheers!

upstage org

Hi @antonhugs! I am Nayeon from Upstage.
I'm glad to hear that you are enjoying using Solar. What use case are you working on?
Solar is available on Ollama, so you might consider using it there. Thank you for the suggestion; we will discuss it with the team. πŸ€—
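
For example, a minimal sketch using the Ollama Python client (`pip install ollama`); the `solar` model tag is the one published in the Ollama library:

```python
import ollama

# Assumes the "solar" tag from the Ollama library has been pulled locally,
# e.g. with `ollama pull solar`.
reply = ollama.chat(
    model="solar",
    messages=[{"role": "user", "content": "Explain HTTP 429 in one sentence."}],
)
print(reply["message"]["content"])
```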

I'm running it with vLLM, but I'd rather use an API than host it myself.
My use cases include simple one-off questions: programming questions, rephrasing support replies, general info questions.
What I like about the model is that it sticks to the system prompt better than anything else I have tried. E.g. when I say to limit responses to 2 sentences, it does so, while most other LLMs over-explain themselves.
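
For reference, a minimal sketch of my setup: vLLM's OpenAI-compatible server on localhost with the SOLAR instruct model (adjust the URL and model name to your own deployment).

```python
from openai import OpenAI

# Assumes vLLM's OpenAI-compatible server is running locally, e.g.:
#   vllm serve upstage/SOLAR-10.7B-Instruct-v1.0
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="upstage/SOLAR-10.7B-Instruct-v1.0",
    messages=[
        # The kind of constraint Solar tends to honor well:
        {"role": "system", "content": "Limit responses to 2 sentences."},
        {"role": "user", "content": "Why might a service return HTTP 429?"},
    ],
)
print(response.choices[0].message.content)
```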

antonhugs changed discussion status to closed
