---
tags:
- llamafile
- GGUF
base_model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
---
## TinyLlama-1.1B-Chat-v1.0-llamafile
llamafile lets you distribute and run LLMs with a single file. See the [announcement blog post](https://hacks.mozilla.org/2023/11/introducing-llamafile/) for background, and the quick-start sketch after the download list for how to run one of the files below.
#### Downloads
- [tinyllama-1.1b-chat-v1.0.Q3_K_M-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q3_K_M-server.llamafile)
- [tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile)
- [tinyllama-1.1b-chat-v1.0.Q5_K_M-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q5_K_M-server.llamafile)
- [tinyllama-1.1b-chat-v1.0.Q8_0-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q8_0-server.llamafile)
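#### Quick start

A minimal sketch of running one of the server llamafiles above (Q4_K_M shown here), assuming a Unix-like shell and the default llama.cpp server port of 8080; the exact options and port may differ depending on the llamafile build, so check the console output when it starts.

```sh
# Download a server llamafile, make it executable, and run it.
# On Windows, rename the file to end in .exe instead of using chmod.
wget https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile
chmod +x tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile
./tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile

# The server variant exposes the llama.cpp HTTP API; a completion can
# then be requested from another terminal, for example with curl:
curl http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Q: What is a llamafile?\nA:", "n_predict": 64}'
```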
This repository was created using the [llamafile-builder](https://github.com/rabilrbl/llamafile-builder).