This is the 110M-parameter Llama 2 architecture model trained on the TinyStories dataset. The weights were converted from karpathy/tinyllamas; see the llama2.c project for more details.
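The "110M" label can be checked with a quick back-of-the-envelope count. A minimal sketch, assuming the llama2.c stories110M configuration (dim=768, 12 layers, hidden dim 2048, vocab 32000, tied input/output embeddings); these dimensions come from the llama2.c project, not this card:

```python
# Rough parameter count for the llama2.c "110M" config.
# The dimensions below are an assumption taken from llama2.c's
# stories110M settings, not stated in this model card.
dim, n_layers, vocab, hidden = 768, 12, 32000, 2048

embed = vocab * dim          # token embeddings (tied with the output head)
attn = 4 * dim * dim         # wq, wk, wv, wo per layer
ffn = 3 * dim * hidden       # w1, w2, w3 (SwiGLU feed-forward) per layer
norms = 2 * dim              # two RMSNorm weight vectors per layer
total = embed + n_layers * (attn + ffn + norms) + dim  # +dim: final RMSNorm

print(f"{total:,}")  # ≈ 1.1e8, matching the "110M" in the model name
```

The total comes out to roughly 1.10 × 10⁸ parameters, which is where the model's name comes from.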
