Tags: Text Generation · Transformers · Safetensors · mistral · medical · text-generation-inference · Inference Endpoints

This is an ongoing experiment in training and retraining boundaries.

The model is currently overtrained, and deliberately so, in order to investigate paths out of overtraining.

This is purely an experiment in the depths (and depravity) of repetitive training. Don't bother messing around with it much.
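If you do want to poke at it, the checkpoint can be loaded like any other causal language model on the Hub. A minimal sketch using the transformers library (the prompt is only an illustration; medical outputs from an intentionally overtrained 248M model should not be trusted):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jtatman/tinymistral-mediqa-248m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation; expect repetitive output given the
# overtrained state of the checkpoint.
inputs = tokenizer("What are common symptoms of anemia?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```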

Downloads last month: 18
Model size: 248M params (Safetensors)
Tensor type: F32

Datasets used to train jtatman/tinymistral-mediqa-248m