About the 3090 Ti card
Thanks for your models! Reading the stats, it seems you're on a similar setup to mine: I have a 3090 Ti (the CPU shouldn't matter much, but for reference it's a Ryzen 7 7700X).
Which model would you suggest using in oobabooga for this setup?
I've just installed your Vicuna 1.1 and I'm going to try it. My goal is to code some kind of Auto-GPT (in C#) that uses the oobabooga API to communicate between the LLM and the app, all hosted on my machine.
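Something like this minimal C# sketch is the kind of glue I have in mind — the port, the endpoint path and the JSON field names are just my assumptions until I've read the oobabooga API docs properly:

```csharp
// Minimal sketch of calling a locally hosted oobabooga instance from C#.
// NOTE: the endpoint, port and JSON shapes below are assumptions, not
// confirmed against the text-generation-webui API docs.
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class LlmClient
{
    private static readonly HttpClient Http = new HttpClient();

    public static async Task<string> GenerateAsync(string prompt)
    {
        // Assumed endpoint: the API extension listening on port 5000.
        var url = "http://localhost:5000/api/v1/generate";

        // Assumed request shape: prompt plus a couple of sampling parameters.
        var payload = JsonSerializer.Serialize(new
        {
            prompt = prompt,
            max_new_tokens = 200,
            temperature = 0.7
        });

        var response = await Http.PostAsync(
            url, new StringContent(payload, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        // Assumed response shape: { "results": [ { "text": "..." } ] }
        using var doc = JsonDocument.Parse(
            await response.Content.ReadAsStringAsync());
        return doc.RootElement
                  .GetProperty("results")[0]
                  .GetProperty("text")
                  .GetString() ?? "";
    }

    static async Task Main()
    {
        Console.WriteLine(await GenerateAsync("Hello, who are you?"));
    }
}
```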
Thanks
Hi there
Vicuna 1.1 is a very good and popular model. I'd definitely recommend trying that.
I haven't experimented with GPT4-Alpaca much yet. In theory it should be even better, since it's based on a 30B model, and that should definitely help. On the other hand, I think Vicuna had particularly good training data, probably better than this model's, so I can't say for certain how those two factors will balance out.
You should just be able to fit GPT4-Alpaca in 24GB, so it's definitely worth trying.
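As rough back-of-the-envelope maths: a 30B model quantised to 4-bit with GPTQ is about 30 × 10⁹ parameters × 0.5 bytes ≈ 15GB of weights, which leaves some headroom in 24GB for the context and activations. Exact usage will vary with group size and context length.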
At home I only have a 16GB AMD 6900 XT, which I run on macOS. I do all my GPTQ conversions in the cloud, first on Google Colab and now on RunPod. So I pay a little money per hour for access to NVIDIA GPUs, like the 3090 and 4090 with 24GB, and the A100 with 40GB or 80GB.
Eventually I plan to get a GPU at home as well but that will have to wait a few months until I can get the cash together! :)
Good luck with your project, hope it works out well.
If you want to come and discuss LLMs in more detail and talk about your development, there's a lot of LLM discussion in the Nomic AI / GPT4All Discord server: https://discord.gg/ZHaesTnb
Thanks for the Discord link, I'll take a look. And good luck with the GPU plan, it's worth it :)