LogoS-7Bx2-MoE-13B-v0.1

Model built by @RubielLabarta using the SLERP merge method. The model is released for research purposes only; commercial use is not allowed.
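SLERP (spherical linear interpolation) merges two checkpoints by interpolating along the arc between their weight tensors instead of averaging them linearly, which preserves the magnitude of the interpolated weights better. A minimal NumPy sketch of the idea, not the exact merge configuration used for this model (the function name and the `eps` fallback are illustrative assumptions):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between tensors v0 and v1 at fraction t in [0, 1]."""
    v0f, v1f = v0.ravel(), v1.ravel()
    # Angle between the two tensors, treated as flat vectors
    dot = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    if theta < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In practice a merge tool applies this layer by layer, often with a different `t` per layer.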

LogoS is an experiment with the Mixture-of-Experts (MoE) method, which can significantly improve on the performance of the original models. The model has 12.9B parameters.
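In an MoE layer, a small gating network routes each token to a few expert sub-networks and mixes their outputs, so only a fraction of the parameters are active per token. A minimal top-k routing sketch in NumPy, purely illustrative and not the actual LogoS architecture (function names, shapes, and `k=2` are assumptions):

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route each token in x (tokens, dim) to its top-k experts and
    mix the expert outputs with softmax-normalized gate scores."""
    logits = x @ gate_w                        # (tokens, n_experts) gate scores
    topk = np.argsort(logits, axis=-1)[:, -k:] # indices of the k best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, topk[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()                           # softmax over the selected experts only
        for weight, e in zip(w, topk[t]):
            out[t] += weight * experts[e](x[t])
    return out
```

With identity experts the mixture weights sum to 1, so the layer reduces to a pass-through, which is a quick sanity check on the routing math.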

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

Metric                                   Value
Avg.                                     77.14
AI2 Reasoning Challenge (25-shot)        74.49
HellaSwag (10-shot)                      89.07
MMLU (5-shot)                            64.74
TruthfulQA (0-shot)                      74.57
Winogrande (5-shot)                      88.32
GSM8k (5-shot)                           71.65
Safetensors · 12.9B params · BF16