Concedo-llamacpp

This is a placeholder model used by a llama.cpp-powered KoboldAI API emulator by Concedo. This is NOT LLaMA. Do not download or use this model directly.

The placeholder is required so that the emulated API works correctly within the official KoboldAI client.

Check out https://github.com/LostRuins/llamacpp-for-kobold for more information.
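For reference, a minimal sketch of querying the emulated KoboldAI text-generation API once the emulator is running with a real model loaded. The local address, default port 5001, and the /api/v1/generate endpoint are assumptions based on the project's usual defaults, not guarantees from this card; adjust them to your setup.

```python
# Minimal sketch: call the KoboldAI-compatible API exposed by the emulator.
# Assumptions: the server is running locally on its default port 5001 and
# exposes the KoboldAI United /api/v1/generate endpoint.
import requests

API_URL = "http://localhost:5001/api/v1/generate"  # assumed default address

payload = {
    "prompt": "Once upon a time,",
    "max_length": 80,       # number of tokens to generate
    "temperature": 0.7,     # sampling temperature
}

response = requests.post(API_URL, json=payload, timeout=120)
response.raise_for_status()

# The KoboldAI API returns generated text under results[0]["text"].
print(response.json()["results"][0]["text"])
```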

All feedback and comments can be directed to Concedo on the KoboldAI Discord.

Model size: 58.3M params (GGUF format, llama architecture)