General discussion - Poppy_Porpoise-1.0-L3-8B

#1
by Lewdiculous - opened

For quant issues, comment here; to give feedback from your usage, tag the author or comment on their original model card.

Lewdiculous pinned discussion

I look forward to trying this. Sadly, I'm getting some weird "select new API" error meant for ooba, but since I use koboldcpp, I can't select the legacy API option. I'm about to try the kobold release before the newest one; maybe that will roll it back until this is sorted, I dunno!

Is there a GitHub issue about that? :o

There was not, but I did just figure out the weirdly obscure issue a moment ago: you have to put http://localhost:5001/v1/ in the API URL field that it saves, under where you connect ST to it. The only way I figured it out was that I randomly noticed an OpenRouter API line in the kobold command line on load, which I had never seen before, and just tried it. Things seem to work fine now.

Strange, I've always used just http://localhost:6969 with KCPP, but if it's working, that's good news. I've often had problems with ports, though not with this one.


I found that /v1 is an OpenAI-compatible API. This site lets you test your APIs:
https://lite.koboldai.net/koboldcpp_api#/

So I shouldn't be using that if I want to stay completely local? Then I wonder what I should be putting in so I don't get that error.

It's still local. It's just an OpenAI compatible endpoint.

In the Connection Tab → Text Completion → KoboldCpp, you should be able to connect with just your http://localhost:5001. Try using a different --port.
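To summarize the two endpoint styles discussed above: KoboldCpp serves its native Text Completion API at the bare base URL, while its OpenAI-compatible endpoint lives under the /v1 path. Here is a minimal sketch of that distinction, assuming the default port 5001 from this thread; `api_url` is a hypothetical helper for illustration, not part of KoboldCpp or SillyTavern.

```python
def api_url(base="http://localhost:5001", openai_compatible=False):
    """Return the URL to paste into SillyTavern's connection field.

    base              -- where KoboldCpp is listening (set via its --port flag)
    openai_compatible -- True for the OpenAI-compatible endpoint under /v1,
                         False for the native KoboldCpp endpoint at the base URL
    """
    base = base.rstrip("/")  # avoid double slashes when appending the path
    return base + "/v1" if openai_compatible else base

print(api_url())                        # native KoboldCpp endpoint
print(api_url(openai_compatible=True))  # OpenAI-compatible endpoint
```

Both URLs point at the same local server, so either choice stays fully offline; the /v1 form just speaks the OpenAI wire format.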

Weird, that's what I was using when I was having that error, but going back now it just works. Maybe I just botched the update, since I redid it earlier. I moved to the new build ST has going on.

Just to be clear, the quants are v1.0, even though they are named 0.85? Thx :)

@Blobbinski - heya! Yes, that's correct. The model got updated and the name changed, but the files should be the same. I just forgot to rename the files. Done now.

Lewdiculous changed discussion title from General discussion - Poppy_Porpoise-0.85-L3-8B to General discussion - Poppy_Porpoise-1.0-L3-8B

Use version 0.72 instead.
This model version is now deprecated since the author has identified significant issues with it. Hopefully the next iterations will be better. It was worth trying.
