Llama 3.1 crashing Kobold
Is anyone else having Kobold crash to desktop (CTD) when loading a GGUF?
hi hi!
You use Llama? My older Llama works fine.. this new 3.1, I don't know if it's too powerful or what the deal is?
not the right place to talk about that dude
Why? Because of the program I use, Kobold? What's wrong with that?
Dude, you're talking about Lumimaid in meta-llama/Llama-3.3-70B-Instruct.
And even then, using Kobold means you'd be running a GGUF; this is a safetensors model.
No, the models I found were GGUF. I wanted to use the 3.3 model.. Yes, I know THIS model is safetensors.. If I can't even get the 3.1 GGUF model to work, why would I request an upgrade?? That's the point I'm trying to make with my question. Sorry it wasn't clear to the community.
I don't think any of us really understand what you're trying to say
you're asking about Lumimaid, someone else's tune of a Llama model, in GGUF format, crashing in Koboldcpp
Meta's Llama 3.3 model has absolutely no relation to Lumimaid, GGUF, or Koboldcpp, so this isn't the appropriate place to ask
Either try the original model page where you downloaded it, or Koboldcpp's GitHub/Discord.
You're not helping, Bartoski. At this point you're following every post I make and harassing me, so I am going to say it here. LEAVE ME ALONE
LEAVE ME ALONE
One more time since cyber law requires it.. LEAVE ME ALONE.
Now I get to screenshot this and send the file off.
Leave me alone dude