A 3.75bpw version

#2
by deleted - opened

Would you consider creating a 3.75bpw quantized version?

My server disk space is very limited, and I don't have enough room to work with the full-sized model required for quantization. Whether or not you're able to create this 3.75bpw version, I'm very grateful for your contributions. Your work is making these powerful models more accessible to the community.

Thank you very much.
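
For reference, a 3.75bpw target is typical of ExLlamaV2 (exl2) quants, and producing one requires the full-precision model on disk plus a working directory for measurement files, which is why limited disk space makes it hard to do locally. A minimal sketch of how such a quant is usually produced, assuming the ExLlamaV2 convert.py script and placeholder paths:

```
# Sketch only: paths are placeholders; flags assume the ExLlamaV2 convert.py script.
# -i  : directory containing the original fp16/bf16 model
# -o  : scratch/working directory (measurement and temp files go here)
# -cf : output directory for the finished quantized model
# -b  : target average bits per weight
python convert.py \
    -i ./Model-fp16 \
    -o ./work \
    -cf ./Model-3.75bpw-exl2 \
    -b 3.75
```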

Hi, I'm sorry, but I only make quants that I use myself or for my own models.

altomek changed discussion status to closed
