hf-llm-api / apis/chat_api.py

Commit History

:boom: [Fix] Error 422 when max_tokens is null
a54e7a6

Hansimov committed on

:zap: [Enhance] Auto calculate max_tokens if not set
1b9f698

Hansimov committed on

:boom: [Fix] Ignore invalid HF Token
8ab8ca6

Hansimov committed on

:zap: [Enhance] Update available models description
c769be6

Hansimov committed on

:zap: [Enhance] Support providing api_key via OS env HF_TOKEN
bc384a3

Hansimov committed on

:gem: [Feature] Support calling HF API with api_key via HTTP Bearer
2da6968

Hansimov committed on

:gem: [Feature] Support no-stream mode with dict response
d2b20f2

Hansimov committed on

:boom: [Fix] Suppress ping message by increasing ping interval
489b65b

Hansimov committed on

:zap: [Enhance] New models: mistral-7b and openchat-3.5
e916990

Hansimov committed on

:zap: [Enhance] Default models and max_new_tokens
6aa8b86

Hansimov committed on

:gem: [Feature] New ArgParser: Specify server IP, port, and running mode
e28221f

Hansimov committed on

:gem: [Feature] New ChatAPIApp: Enable FastAPI for OpenAI-format API calls
3a09006

Hansimov committed on