Problems again: another significant error was found in llama.cpp.
I came across this post on Reddit; it looks like there are more issues:
https://www.reddit.com/r/LocalLLaMA/comments/1cltac3/part3_cause_to_issue_found_possible_bug_llama3/
From what I've read, the problem is with the regular expression, and the fix is to replace the regex on the line discussed here: https://github.com/ggerganov/llama.cpp/issues/7062#issuecomment-2097033871
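For context on why a wrong pre-tokenizer regex matters (a simplified illustration, not the actual llama.cpp code): BPE tokenizers first split text into pre-tokens with a regex, and merges never cross pre-token boundaries, so a subtly wrong pattern silently changes tokenization. A minimal sketch using Python's built-in `re` with made-up stand-in patterns — the real Llama-3 pre-tokenizer regex uses Unicode property classes (`\p{L}`, `\p{N}`) and contraction handling that `re` doesn't support here:

```python
import re

text = "don't count 1234 tokens"

# Hypothetical ASCII-only stand-ins, for illustration only; these are
# NOT the patterns used in llama.cpp.
# This one keeps contractions together and caps digit runs at 3,
# loosely mimicking the intended pre-tokenization behavior.
intended_like = re.compile(r"'[a-z]+|[A-Za-z]+|\d{1,3}| ?[^\sA-Za-z\d]+|\s+")
# This one splits contractions apart and leaves digit runs unbounded,
# loosely mimicking how a wrong pattern shifts pre-token boundaries.
broken_like = re.compile(r"[A-Za-z]+|\d+|\S|\s+")

print(intended_like.findall(text))
# ['don', "'t", ' ', 'count', ' ', '123', '4', ' ', 'tokens']
print(broken_like.findall(text))
# ['don', "'", 't', ' ', 'count', ' ', '1234', ' ', 'tokens']
```

Different pre-token boundaries mean different BPE merges downstream, which is why models converted with the wrong regex come out subtly broken rather than obviously failing.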
"To live is to suffer..."
We can't catch a break. I'll eventually redo and re-upload the quants, but I'll let the dust settle first.
I just wish they'd find a way to fix this and port it to convert.py.
I know, right... Honestly, I'm just happy the quants will be even better afterwards. At least I'll try to look on the positive side.
Personally, I'll probably take a short break from all this and check back in a couple of days to see what's changed. It would certainly be nice if they finally solved this, because it's unpleasant to realize that at some point I unknowingly started producing slightly broken models.
@Lewdiculous
The issue was recently closed. One of the users in that thread suggested this solution to the data-loss problem:
https://github.com/ggerganov/llama.cpp/issues/7062#issuecomment-2102007944
Although I may well have mixed something up; if so, I apologize in advance.
@Lewdiculous
I've probably already pissed you off, but I can't stop following the changes. Here's what they did:
https://github.com/ggerganov/llama.cpp/releases/tag/b2831
Not at all, thanks for keeping me posted on this. I'll look into it later and see how it goes.