Not getting coherent results when running inference on this in llama.cpp

#1
by nisten - opened

I tried long and short contexts, with and without the prompt format, and I'm not getting coherent results from the GGUF files for this model.

Not sure if this is due to llama.cpp or the model itself.
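Roughly the kind of call I was making, as a minimal sketch using llama-cpp-python; the model filename, context size, and sampling settings here are placeholders and not this model's documented configuration:

```python
# Sketch of the two cases tried: with a chat template ("format") and as a
# raw completion ("no format"). Paths and parameters are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="model-q4_k_m.gguf",  # hypothetical GGUF file name
    n_ctx=4096,                      # tried both short and long contexts
)

# With a chat format applied:
chat_out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain what a GGUF file is."}],
    max_tokens=256,
)
print(chat_out["choices"][0]["message"]["content"])

# Raw completion without any template:
raw_out = llm("Explain what a GGUF file is.", max_tokens=256)
print(raw_out["choices"][0]["text"])
```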

(Three screenshots attached, 2024-04-26: llama.cpp terminal output showing the incoherent generations.)
