Instructions on how to load this?

#2
by unluckyton - opened

Is this supposed to work on a 4090? New to all this and need some help.

I downloaded this into my KoboldAI models folder. I tried to load it as-is but got errors. I found https://huggingface.co/ausboss/llama-30b-supercot/tree/main and copied their pytorch_model-00001-of-00243.bin files into the directory, then tried loading again. It started but ran out of VRAM at ~30%. Help please? Thanks

You don't need those bin files. You do need the Occam latestgptq branch of KoboldAI for 4-bit model support: https://github.com/0cc4m/KoboldAI
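A minimal sketch of getting that branch set up, assuming the branch is named `latestgptq` as mentioned above (check the repo's branch list if the name has changed):

```shell
# Clone Occam's KoboldAI fork (URL from the post above)
git clone https://github.com/0cc4m/KoboldAI.git
cd KoboldAI

# Switch to the 4-bit support branch (assumed name: latestgptq)
git checkout latestgptq

# Confirm you are on the right branch before installing/launching
git branch --show-current
```

From there, place the 4-bit model folder under `models/` and launch KoboldAI as usual; it should detect the quantized weights without any extra `.bin` files.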

Thanks, got it to work with that, appreciate the assistance. If I want to use textgen webui, is the main repo fine, or is there a special repo I should try for that too?

I don't use ooba, but I think the main repo is fine with that one.
