Upload README.md with huggingface_hub
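
The commit title indicates the file was pushed with the `huggingface_hub` client. A minimal sketch of such an upload, assuming the standard `HfApi.upload_file` call and an already-authenticated session (the exact invocation used for this commit is not recorded here):

```python
# Hypothetical reconstruction of the upload named in the commit title.
# Assumes prior authentication, e.g. `huggingface-cli login` or an HF_TOKEN env var.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="README.md",                            # local file to upload
    path_in_repo="README.md",                               # destination path inside the repo
    repo_id="rabil/NeuralHermes-2.5-Mistral-7B-llamafile",  # target model repo
    repo_type="model",
    commit_message="Upload README.md with huggingface_hub",
)
```
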
README.md (CHANGED)
@@ -1,5 +1,5 @@
 
-
+## NeuralHermes-2.5-Mistral-7B-llamafile
 
 llamafile lets you distribute and run LLMs with a single file. [announcement blog post](https://hacks.mozilla.org/2023/11/introducing-llamafile/)
 
@@ -7,5 +7,6 @@ llamafile lets you distribute and run LLMs with a single file. [announcement blo
 
 - [neuralhermes-2.5-mistral-7b.Q4_K_M.llamafile](https://huggingface.co/rabil/NeuralHermes-2.5-Mistral-7B-llamafile/resolve/main/neuralhermes-2.5-mistral-7b.Q4_K_M.llamafile)
 - [neuralhermes-2.5-mistral-7b.Q5_K_M-server.llamafile](https://huggingface.co/rabil/NeuralHermes-2.5-Mistral-7B-llamafile/resolve/main/neuralhermes-2.5-mistral-7b.Q5_K_M-server.llamafile)
+- [neuralhermes-2.5-mistral-7b.Q8_0-server.llamafile](https://huggingface.co/rabil/NeuralHermes-2.5-Mistral-7B-llamafile/resolve/main/neuralhermes-2.5-mistral-7b.Q8_0-server.llamafile)
 
 This repository was created using the [llamafile-builder](https://github.com/rabilrbl/llamafile-builder)
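
For reference, the llamafiles linked above are self-contained executables: download one, mark it executable, and run it. A short sketch using `huggingface_hub` to fetch the Q4_K_M build from this repo (the choice of file and the direct `subprocess` launch are illustrative; consult the llamafile documentation for platform-specific notes such as Windows):

```python
# Illustrative download-and-run helper for one of the llamafiles listed above.
# Assumes a Unix-like system where the file can be executed directly once made executable.
import os
import stat
import subprocess

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="rabil/NeuralHermes-2.5-Mistral-7B-llamafile",
    filename="neuralhermes-2.5-mistral-7b.Q4_K_M.llamafile",
)

# Mark the downloaded llamafile as executable, then launch it like any other binary.
os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
subprocess.run([path], check=True)
```
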