Is 8K SuperHOT possible?

by rombodawg - opened

Since this is technically just WizardLM trained on the Falcon dataset, can it be made into a SuperHOT model with 8K context, like how you made the WizardLM-33B-V1-0-Uncensored-SuperHOT-8K-GPTQ model? If it's possible it would be greatly appreciated. I've been waiting for a larger quantized model with 8K context as a free competitor to GPT-3.5.

Making a SuperHOT model requires an 8K-trained LoRA. So far those have only been released for Llama. If one comes out I will do it, or I might see about making my own if I have time.
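
For reference, if an 8K SuperHOT-style LoRA did exist for this architecture, the merge step itself is straightforward with PEFT. A minimal sketch is below; the repo IDs are placeholders (no Falcon 8K LoRA has actually been released), so treat this as an illustration of the recipe rather than something you can run today:

```python
# Sketch: merge a (hypothetical) 8K-trained LoRA into a base model, then save
# the merged weights so they can be quantized with GPTQ afterwards.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "WizardLM/WizardLM-Uncensored-Falcon-40b"  # example base model
lora_id = "someuser/falcon-superhot-8k-lora"          # placeholder - no such LoRA exists yet

# Load the base model in full precision so the LoRA weights merge cleanly.
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype="auto", trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Attach the LoRA adapter, then fold its weights into the base model.
model = PeftModel.from_pretrained(base, lora_id)
merged = model.merge_and_unload()

# Save the merged model; GPTQ quantization runs on this output.
merged.save_pretrained("falcon-superhot-8k-merged")
tokenizer.save_pretrained("falcon-superhot-8k-merged")
```

Note that SuperHOT also relies on scaled rotary position embeddings at inference time, so the merged model would additionally need the RoPE scaling patch (compression factor 0.25 for 2K to 8K) applied by the loader for the extended context to actually work.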

Sweet, I'll look forward to the GPTQ model if you end up releasing it!
