micro, small, ... medium?

#3
by lucyknada - opened

Hi there! Amazing little model. Do you have plans to scale this to 32B? Thanks!

Oxygen org

Hi! Yes, it's planned, but I'm going to need more compute resources!

That's awesome news, thanks! Re: compute, are you able to secure that? Also, are there any plans to release the datasets? Thanks!

Oxygen org

Hello! Unfortunately, the datasets will remain private. We are currently looking for partners to finance the training of the medium version (probably Qwen 32B or Llama 3.3 70B, I'm not sure yet).

Both would be interesting to see, though 70B is generally out of reach for most people; it depends on who you're targeting (the 16-24 GB average user vs. the 48 GB niche). Glad to hear there are talks to get it trained, though! Thanks!
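As a side note on the VRAM point above, here is a rough back-of-the-envelope sketch (not from the thread; the 4-bit quantization level and ~15% overhead factor are assumptions) of why a quantized 32B model fits a 24 GB card while a 70B model typically needs 48 GB-class hardware or multiple GPUs:

```python
# Rough VRAM estimate for quantized weights.
# Assumptions: 4-bit weights plus ~15% overhead for KV cache and
# activations; real usage varies with context length and runtime.
def est_vram_gb(params_b: float, bits: int = 4, overhead: float = 1.15) -> float:
    weight_gb = params_b * bits / 8  # billions of params * bytes per param
    return weight_gb * overhead

for size in (32, 70):
    print(f"{size}B @ 4-bit: ~{est_vram_gb(size):.0f} GB")
# 32B @ 4-bit: ~18 GB  -> fits a 24 GB consumer card
# 70B @ 4-bit: ~40 GB  -> needs 48 GB-class hardware or multi-GPU
```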

Oxygen org

Yes, I know, lol. I think I'll go with QwQ 32B sooner rather than later, but I'm still looking for partners who would like to finance the training. I'm going to wait until the Christmas holidays to take care of all that quietly, outside of class!
