What are the use cases? It is deranged like Joe Biden

#10
by froilo - opened

It is deranged like Joe Biden. Completely incoherent...

I am not trolling.
Maybe I am missing something...

How do I use it?
What should I use it for?

Maybe it was meant to give you the feeling of talking to Joe Biden?

@froilo
Obviously, don't expect it to be anywhere near even a 7B Mistral model, which alone is 7x the size of this one.

It could serve for extremely fast inference on an edge device; most devices can't run 7B models and barely fit quantized 7B models.
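For example, here is a minimal sketch of running it locally on CPU with the transformers pipeline (the model ID is a placeholder, substitute the actual checkpoint):

```python
# Minimal sketch: running a ~1B-parameter model on CPU.
# "your-org/small-model" is a placeholder; substitute the actual checkpoint.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="your-org/small-model",  # placeholder model ID
    device=-1,                     # -1 = CPU; a ~1B model fits comfortably in RAM
)

print(generator("The quick brown fox", max_new_tokens=20)[0]["generated_text"])
```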
You could also fine-tune it for niche tasks, which is easy since it takes very little VRAM.
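A rough sketch of what that could look like with LoRA via the peft library (the model ID and target modules are assumptions, adjust them for the actual architecture):

```python
# Sketch: parameter-efficient fine-tuning with LoRA; only the adapter weights are
# trained, so VRAM usage stays low. Model ID and target_modules are assumptions.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("your-org/small-model")  # placeholder

lora_cfg = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # typical for Llama-style attention layers
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only a small fraction of the full model
# ...then train on your niche dataset with Trainer or trl's SFTTrainer as usual.
```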
You could also use it as a draft model for speculative decoding, which can make larger Llama models such as Llama 2 7B or 70B considerably faster.
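transformers supports this via assisted generation: you pass the small model as `assistant_model` to `generate()`. A sketch, assuming the draft model shares the target model's tokenizer and that the model IDs below are placeholders:

```python
# Sketch: speculative (assisted) decoding -- the small model drafts tokens,
# the large model verifies them. Model IDs are placeholders / examples.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

target_id = "meta-llama/Llama-2-7b-hf"
draft_id = "your-org/small-model"  # placeholder for the small model discussed here

tokenizer = AutoTokenizer.from_pretrained(target_id)
target = AutoModelForCausalLM.from_pretrained(target_id, torch_dtype=torch.float16, device_map="auto")
draft = AutoModelForCausalLM.from_pretrained(draft_id, torch_dtype=torch.float16, device_map="auto")

inputs = tokenizer("Speculative decoding speeds things up because", return_tensors="pt").to(target.device)
out = target.generate(**inputs, assistant_model=draft, max_new_tokens=50)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The speedup depends on how often the large model accepts the draft tokens, so it works best when the two models are trained on similar data.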
