---
license: mit
---

# NousResearch - Obsidian-3B-V0.5 GGUF Q6 version by Nisten
To run the server, open a terminal inside your llama.cpp folder and run:

## ./server -m obsidian-q6.gguf --mmproj mmproj-obsidian-f16.gguf -ngl 42

That's it, it's literally one command. Now open your browser at http://127.0.0.1:8080
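You can also talk to the running server from the command line instead of the browser. A minimal sketch, assuming the standard llama.cpp server `/completion` endpoint on the default port (not specific to this model):

```bash
# Query the running llama.cpp server over its HTTP API
# (assumes the default /completion endpoint on port 8080)
curl http://127.0.0.1:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Describe what a GGUF file is in one sentence.", "n_predict": 64}'
```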

## FIRST TIME SETUP (Mac or Linux): THIS WILL DOWNLOAD AND RUN EVERYTHING IN ONE SHOT
## JUST MAKE A NEW FOLDER, OPEN A TERMINAL, CD INTO THE FOLDER, AND COPY-PASTE THIS. THAT'S IT.

```bash
git clone -b stablelm-support https://github.com/Galunid/llama.cpp.git && \
cd llama.cpp && \
make && \
wget https://huggingface.co/nisten/obsidian-3b-multimodal-q6-gguf/resolve/main/obsidian-q6.gguf && \
wget https://huggingface.co/nisten/obsidian-3b-multimodal-q6-gguf/resolve/main/mmproj-obsidian-f16.gguf && \
./server -m obsidian-q6.gguf --mmproj mmproj-obsidian-f16.gguf -ngl 42
```
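If `wget` is not available (macOS does not ship it by default), the two model downloads can be done with `curl` instead. This is just an alternative sketch of the same download step, using the same URLs:

```bash
# Same downloads as above, using curl instead of wget
curl -L -O https://huggingface.co/nisten/obsidian-3b-multimodal-q6-gguf/resolve/main/obsidian-q6.gguf
curl -L -O https://huggingface.co/nisten/obsidian-3b-multimodal-q6-gguf/resolve/main/mmproj-obsidian-f16.gguf
```

After the first run, the clone, build, and downloads do not need to be repeated; just use the single `./server` command from the top of this page.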