llama.cpp guide
#1 opened by Euchale
Hi David,
I was wondering what the "-c 2048 -ngl 99" at the end of the model command does. Not necessarily because I want to run the model from the guide, but because I'm interested in running models directly in llama.cpp from now on. Is there any guide you would recommend? Or should I just use KoboldCpp instead and hope that it's always updated quickly?
Thanks!
@Euchale
Hey,
Server guide is here:
https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md
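To sketch what those two flags do: in llama.cpp, `-c` sets the context window size in tokens, and `-ngl` sets how many model layers to offload to the GPU (a large value like 99 effectively means "all layers"). A minimal invocation might look like this; the model path is a hypothetical placeholder, so substitute your own GGUF file:

```shell
# -c 2048  : set the context window to 2048 tokens
# -ngl 99  : offload up to 99 model layers to the GPU (high enough to cover all of them)
# ./models/model.gguf is a placeholder path, not a real file from the guide.
./llama-server -m ./models/model.gguf -c 2048 -ngl 99
```

The same flags work with the CLI binary (`llama-cli`) as well as the server; the server README linked above documents the full set of options.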
Why is this not in their wiki section?! Thank you