---
license: other
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- transformers
- gguf
- imatrix
- Nous-Hermes-Llama2-13b
---

Quantizations of https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b

### Inference Clients/UIs

* [llama.cpp](https://github.com/ggerganov/llama.cpp)
* [JanAI](https://github.com/janhq/jan)
* [KoboldCPP](https://github.com/LostRuins/koboldcpp)
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
* [ollama](https://github.com/ollama/ollama)
* [GPT4All](https://github.com/nomic-ai/gpt4all)

---

# From original readme

## Prompt Format

The model follows the Alpaca prompt format:

```
### Instruction:

### Response:
```

or

```
### Instruction:

### Input:

### Response:
```
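
To make the template concrete, here is a minimal sketch that assembles an Alpaca-style prompt and runs it against one of the GGUF files via llama-cpp-python (Python bindings for llama.cpp). The package, the local model filename, and the generation parameters are assumptions for illustration only and are not part of the original model card.

```python
# Minimal sketch, assuming llama-cpp-python is installed and a GGUF
# quantization has been downloaded locally. Filename and parameters
# below are hypothetical.
from llama_cpp import Llama


def build_prompt(instruction: str, input_text: str | None = None) -> str:
    """Assemble an Alpaca-format prompt, with or without the Input section."""
    if input_text:
        return (
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


# Hypothetical local filename for one of the quantized GGUF files.
llm = Llama(model_path="Nous-Hermes-Llama2-13b.Q4_K_M.gguf", n_ctx=4096)

prompt = build_prompt("Summarize what GGUF quantization is in one paragraph.")
result = llm(prompt, max_tokens=256, stop=["### Instruction:"])
print(result["choices"][0]["text"].strip())
```

Stopping on `### Instruction:` keeps the model from continuing into a new self-generated turn, which is a common convention when using Alpaca-style templates.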