Open LLM Leaderboard results:

| Model | Avg ⬆️ | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
|---|---|---|---|---|---|---|---|
| Ramikan-BR/tinyllama-coder-py-v12 | 35.22 | 32 | 53.61 | 26.31 | 40.91 | 57.06 | 1.44 |

Link to the result: Leaderboard

Uploaded model

  • Developed by: Ramikan-BR
  • License: apache-2.0
  • Finetuned from model: unsloth/tinyllama-chat-bnb-4bit

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
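As a usage sketch, the model can be loaded for inference with the `transformers` library. The Zephyr-style chat markers below are an assumption based on the tinyllama-chat base model; prefer `tokenizer.apply_chat_template` if the tokenizer ships a chat template.

```python
def build_prompt(instruction: str) -> str:
    """Format a single-turn coding request.

    Zephyr-style chat markup as used by TinyLlama chat variants
    (assumption; verify against the tokenizer's chat template).
    """
    return (
        "<|system|>\nYou are a helpful Python coding assistant.</s>\n"
        f"<|user|>\n{instruction}</s>\n"
        "<|assistant|>\n"
    )


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the prompt helper stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Ramikan-BR/tinyllama-coder-py-v12"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Greedy decoding (`do_sample=False`) is used here for reproducible code completions; sampling parameters can be added for more varied output.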

  • Model size: 1.1B params (Safetensors)
  • Tensor type: FP16