3 epochs

#2
by KnutJaegersberg - opened

This 30B model seems to have been trained for 3 epochs and would be handy to have in int4:
https://huggingface.co/baseten/alpaca-30b/blob/main/README.md?code=true#L9

This model here might be of interest for conversion, too:
https://huggingface.co/circulus/alpaca-13b

The first link is a LoRA trained for 3 epochs, not a full finetune on the original dataset, so I believe it might have the same issue as this model in not replicating the original results well. I'll still quantize it to 4 bits and test it out.
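For anyone curious what "quantize it to 4 bits" means mechanically, here is a minimal NumPy sketch of symmetric per-row int4 quantization, i.e. mapping float weights to integers in [-8, 7] with one scale per row. This is a generic illustration, not the exact scheme (e.g. GPTQ) used for these checkpoints:

```python
import numpy as np

def quantize_int4(w, axis=-1):
    # Symmetric per-row quantization: one float scale per row,
    # weights rounded to integers in the signed 4-bit range [-8, 7].
    scale = np.max(np.abs(w), axis=axis, keepdims=True) / 7.0
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_int4(q, scale):
    # Recover approximate float weights from int4 codes and scales.
    return q.astype(np.float32) * scale

w = np.random.default_rng(0).normal(size=(4, 8)).astype(np.float32)
q, scale = quantize_int4(w)
w_hat = dequantize_int4(q, scale)
# Rounding error per weight is bounded by half the row scale.
print(np.max(np.abs(w - w_hat)))
```

Real int4 schemes for LLMs add per-group scales, zero points, and calibration data on top of this, but the storage/accuracy trade-off is the same idea.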

The second link is dead for me.

There seem to be LoRA weights plus the original model converted to int4, though I was surprised it worked.
The second link seems to have been taken down. Not sure, but I think I told them a file was missing.
Peace!

KnutJaegersberg changed discussion status to closed
