Trying to use this model (8bit, 32group) with text-generation-webui and failing

#1
by Vodnir - opened

No matter what loader or configuration I attempt to use, I am completely and utterly unable to get this model to load. Mind you, this is my first time ever trying to do this and I may just be totally clueless to an obvious solution here... :S

I'm trying to run this on a PC (Windows 11) with 64 GB of RAM and an RTX 4090.

```json
{
  "bits": 8,
  "group_size": 32,
  "damp_percent": 0.1,
  "desc_act": true,
  "sym": true,
  "true_sequential": true,
  "model_name_or_path": null,
  "model_file_base_name": "model"
}
```
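For what it's worth, that file is the model's `quantize_config.json`, and its fields constrain which loaders can handle it: if I remember right, the ExLlama loader only supports 4-bit GPTQ, so an 8-bit model like this would need AutoGPTQ (or a loader that supports 8-bit and `desc_act=True`). A minimal sketch for sanity-checking the config before picking a loader (the helper and its messages are my own, not part of any library):

```python
# Sanity-check a GPTQ quantize_config before choosing a loader.
# The notes below reflect my understanding and may be out of date.
config = {
    "bits": 8,
    "group_size": 32,
    "damp_percent": 0.1,
    "desc_act": True,
    "sym": True,
    "true_sequential": True,
    "model_name_or_path": None,
    "model_file_base_name": "model",
}

def loader_notes(cfg):
    """Return human-readable notes on loader compatibility."""
    notes = [f"{cfg['bits']}-bit, group size {cfg['group_size']}"]
    if cfg["bits"] != 4:
        # ExLlama (v1) kernels only implement 4-bit GPTQ
        notes.append("not 4-bit: ExLlama loader will likely refuse it")
    if cfg["desc_act"]:
        # act-order models need a kernel that supports desc_act
        notes.append("desc_act=True: loader must support act-order")
    return "; ".join(notes)

print(loader_notes(config))
```

Running it on the config above flags both the 8-bit width and the act-order setting, which matches the kind of silent load failure described here.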

Have you tried using 4 bits?
Shouldn't be a hardware issue. I'm running this with less RAM and a 3090 on Ooba at really good speed. 8k context, 2.5 alpha value.

I've run other models in 4-bit, sure. I was just very much interested in running it in 8-bit... :|
