Producing garbage output
#1 · opened by WeiCon
Some of your GGUF quantizations seem to produce broken output, see here (warning: NSFW content ahead):
- http://ayumi.m8geil.de/results_v3/model_resp_DL_20240107_7B-Q6_K_NSFW_Noromaid_Zephyr.html
- http://ayumi.m8geil.de/results_v3/model_resp_DL_20240104_11B-Q6_K_Velara_V2.html
- http://ayumi.m8geil.de/results_v3/model_resp_DL_20240106_7B-Q6_K_NeuralMaid.html
At least for Velara V2 I know the problem is not the source model, because TheBloke's quantization of Velara V2 works.
Hi, thanks a lot! I'll have to review my code then, because there is almost no variety in the output. I probably have to reconvert some of my models, thanks!
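For reference, reconverting usually means going back through llama.cpp's conversion and quantization tools. Below is a minimal sketch of that pipeline; the paths, file names, and the exact converter script are assumptions (they depend on the llama.cpp version and the model), not the actual commands used for this repo:

```python
# Sketch of a GGUF reconversion pipeline using llama.cpp's tools.
# Assumptions: llama.cpp is checked out locally with the `quantize` binary
# built, and the source HF model lives in MODEL_DIR. All paths and names
# below are placeholders.
import subprocess
from pathlib import Path

LLAMA_CPP_DIR = Path("llama.cpp")           # local llama.cpp checkout (assumed)
MODEL_DIR = Path("Velara-11B-V2")           # source HF model directory (placeholder)
F16_GGUF = Path("velara-11b-v2-f16.gguf")   # intermediate full-precision GGUF
Q6K_GGUF = Path("velara-11b-v2-Q6_K.gguf")  # final quantized file

# 1) Convert the HF checkpoint to an f16 GGUF file.
subprocess.run(
    ["python", str(LLAMA_CPP_DIR / "convert.py"), str(MODEL_DIR),
     "--outfile", str(F16_GGUF), "--outtype", "f16"],
    check=True,
)

# 2) Quantize the f16 GGUF down to Q6_K.
subprocess.run(
    [str(LLAMA_CPP_DIR / "quantize"), str(F16_GGUF), str(Q6K_GGUF), "Q6_K"],
    check=True,
)
```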
s3nh changed discussion status to closed
https://huggingface.co/s3nh/Velara-11B-V2-GGUF now works properly, like the rest of my models. Thanks!