Just wanted to drop a quick "thank-you!"

#2
opened by Nafnlaus

Thanks for training the TinyLlama models; they make a great base for creating specialized, ultrafast fine-tuned models that aren't encumbered by restrictive licenses :) They also happen to be a perfect size for fine-tuning on a 24 GB consumer-grade card with a reasonable batch size.
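
For anyone curious what that looks like in practice, here's a minimal sketch of a full fine-tune on a single 24 GB card using Hugging Face Transformers. The checkpoint name, dataset, and hyperparameters below are illustrative assumptions, not necessarily what I actually used:

```python
# Minimal full fine-tune sketch for a ~1.1B TinyLlama checkpoint on one 24 GB GPU.
# Checkpoint, dataset, and hyperparameters are placeholders for illustration.
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token

# bf16 weights: ~2.2 GB for 1.1B params, leaving headroom for grads/optimizer/activations.
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)

# Any text/instruction dataset works here; wikitext is just a stand-in.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="tinyllama-finetune",
    per_device_train_batch_size=4,     # a "reasonable batch size" for 24 GB
    gradient_accumulation_steps=8,     # effective batch of 32 sequences
    gradient_checkpointing=True,       # trades compute for memory headroom
    learning_rate=2e-5,
    num_train_epochs=1,
    bf16=True,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

At this scale you don't even need LoRA or quantization to stay within 24 GB, though either can buy you longer contexts or larger batches if you want them.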
