Dead Model - No longer works with latest oobabooga
Unfortunately, this model no longer works with the latest version of oobabooga's text-generation-webui.
I tried the AutoGPTQ and GPTQ-for-LLaMa loaders, and neither worked with the latest version. I did somehow get it to load once, but it just spat out a lot of garbage text.
```
Traceback (most recent call last):
  File "D:\AI\UI\text-generation-webui\server.py", line 62, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "D:\AI\UI\text-generation-webui\modules\models.py", line 65, in load_model
    output = load_func_map[loader](model_name)
  File "D:\AI\UI\text-generation-webui\modules\models.py", line 263, in GPTQ_loader
    model = modules.GPTQ_loader.load_quantized(model_name)
  File "D:\AI\UI\text-generation-webui\modules\GPTQ_loader.py", line 163, in load_quantized
    exit()
  File "D:\AI\UI\installer_files\env\lib\_sitebuiltins.py", line 26, in __call__
    raise SystemExit(code)
SystemExit: None
```
I have also tried the LLaMA-Precise and NovelAI-Storywriter presets. No matter what character or prompt I use, everything comes out as gibberish.
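For anyone else debugging this, loading the quantized weights directly with AutoGPTQ, outside the webui, might help narrow down whether the problem is the weights themselves or the webui loader. The snippet below is only a rough sketch: the model directory path, the `device` string, and the `use_safetensors` flag are assumptions about this particular download, not something confirmed by the model card.

```python
# Minimal AutoGPTQ sanity check, independent of text-generation-webui.
# The model directory, device, and use_safetensors flag are assumptions
# about how this particular quantization was downloaded.
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

# Hypothetical local path to the downloaded GPTQ model folder.
model_dir = r"D:\AI\UI\text-generation-webui\models\my-gptq-model"

tokenizer = AutoTokenizer.from_pretrained(model_dir, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_dir,
    device="cuda:0",
    use_safetensors=True,  # assumption: the quantized weights ship as .safetensors
)

# If this prints coherent text, the weights are fine and the issue is
# likely the webui's loader settings (wbits, groupsize, model type, etc.).
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If this also produces gibberish, the quantized files themselves may be incompatible with current AutoGPTQ versions rather than with the webui.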