This model is the base model Locutusque/TinyMistral-248M fully fine-tuned on Locutusque/InstructMix. During validation, it achieved an average perplexity of 3.23 on the Locutusque/InstructMix dataset. It has been trained on approximately 608,000 examples so far; more training epochs are planned.
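As a usage sketch (not part of the original card), the snippet below loads the checkpoint with the standard Hugging Face transformers API, generates a response, and computes a validation-style perplexity as the exponential of the mean cross-entropy loss. The prompt text and the lack of an instruction template are assumptions; the card does not document the exact prompt format or evaluation script, so treat this as illustrative only.

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Locutusque/TinyMistral-248M-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # F32 weights, ~248M params
model.eval()

# Generate a response to a plain prompt. The instruction format is an assumption;
# check the tokenizer config for any special tokens the model was trained with.
prompt = "Explain what a language model is."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Perplexity of a text sample, using the standard definition exp(mean cross-entropy).
# The card's exact evaluation procedure is not specified; this mirrors the usual way
# validation perplexity is reported.
sample = "Instruction: Summarize the water cycle.\nResponse: Water evaporates, condenses into clouds, and falls as precipitation."
enc = tokenizer(sample, return_tensors="pt")
with torch.no_grad():
    loss = model(**enc, labels=enc["input_ids"]).loss
print(f"perplexity: {math.exp(loss.item()):.2f}")
```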

Model size: 248M parameters (Safetensors, F32 tensors)


Datasets used to train Locutusque/TinyMistral-248M-Instruct: Locutusque/InstructMix