| Column | Dtype | Values |
| --- | --- | --- |
| eval_name | string | lengths 12 to 111 |
| Precision | string | 3 distinct values |
| Type | string | 7 distinct values |
| T | string | 7 distinct values |
| Weight type | string | 2 distinct values |
| Architecture | string | 49 distinct values |
| Model | string | lengths 355 to 650 |
| fullname | string | lengths 4 to 102 |
| Model sha | string | lengths 0 to 40 |
| Average ⬆️ | float64 | 1.41 to 51.2 |
| Hub License | string | 25 distinct values |
| Hub ❤️ | int64 | 0 to 5.83k |
| #Params (B) | int64 | -1 to 140 |
| Available on the hub | bool | 2 classes |
| Not_Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04 to 107 |
| IFEval Raw | float64 | 0 to 0.87 |
| IFEval | float64 | 0 to 86.7 |
| BBH Raw | float64 | 0.28 to 0.75 |
| BBH | float64 | 0.81 to 63.5 |
| MATH Lvl 5 Raw | float64 | 0 to 0.51 |
| MATH Lvl 5 | float64 | 0 to 50.7 |
| GPQA Raw | float64 | 0.22 to 0.41 |
| GPQA | float64 | 0 to 21.6 |
| MUSR Raw | float64 | 0.29 to 0.59 |
| MUSR | float64 | 0 to 36.4 |
| MMLU-PRO Raw | float64 | 0.1 to 0.7 |
| MMLU-PRO | float64 | 0 to 66.8 |
| Maintainer's Highlight | bool | 2 classes |
| Upload To Hub Date | string | lengths 0 to 10 |
| Submission Date | string | 147 distinct values |
| Generation | int64 | 0 to 6 |
| Base Model | string | lengths 4 to 102 |
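The schema above is easiest to work with as a dataframe. Below is a minimal sketch of loading and filtering the table, assuming this dump corresponds to the Open LLM Leaderboard contents dataset on the Hugging Face Hub (the repo id `open-llm-leaderboard/contents` is an assumption; substitute whichever export this dump was taken from). The column names match the schema above.

```python
# Minimal sketch, not the canonical loader: assumes the rows come from the
# Open LLM Leaderboard contents dataset on the Hugging Face Hub.
from datasets import load_dataset

REPO_ID = "open-llm-leaderboard/contents"  # assumed repo id; adjust to your source

df = load_dataset(REPO_ID, split="train").to_pandas()

# Benchmark columns as named in the schema above.
score_cols = ["Average ⬆️", "IFEval", "BBH", "MATH Lvl 5", "GPQA", "MUSR", "MMLU-PRO"]

# Example: princeton-nlp submissions, best average first.
princeton = df[df["fullname"].str.startswith("princeton-nlp/")]
print(
    princeton[["fullname", "Precision"] + score_cols]
    .sort_values("Average ⬆️", ascending=False)
    .head(10)
)
```

Note that the "Raw" columns hold the unnormalized benchmark scores (0 to 1), while the unsuffixed columns appear to hold the leaderboard's normalized percentages (for example, a BBH Raw of 0.508 maps to a BBH of 29.6 rather than 50.8).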
princeton-nlp_Llama-3-Instruct-8B-KTO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2 📑
princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2
477d33ea62ed57a0429517170612aa1df21c78d6
24.344687
0
8
true
true
true
false
true
0.630536
0.729025
72.902454
0.507977
29.648406
0.080816
8.081571
0.260067
1.342282
0.37775
4.452083
0.366772
29.641327
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2
princeton-nlp_Llama-3-Instruct-8B-ORPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-ORPO 📑
princeton-nlp/Llama-3-Instruct-8B-ORPO
4bb3ffcf9ede48cb01a10bf3223eb41b59aa3fef
23.534475
0
8
true
true
true
false
true
0.623904
0.712813
71.281311
0.500121
28.839356
0.073263
7.326284
0.258389
1.118568
0.350188
3.240104
0.364611
29.401226
false
2024-05-17
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-ORPO
princeton-nlp_Llama-3-Instruct-8B-ORPO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2 📑
princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2
3ea5c542a3d8d61f6afb6cdbef5972a501ddf759
25.852852
0
8
true
true
true
false
true
0.594232
0.763321
76.332132
0.507835
29.604837
0.095166
9.516616
0.283557
4.474273
0.377969
4.846094
0.373088
30.343159
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2
princeton-nlp_Llama-3-Instruct-8B-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-RDPO 📑
princeton-nlp/Llama-3-Instruct-8B-RDPO
9497ca226a68981f42df2e5b3a4a1a2ea702a942
22.584117
0
8
true
true
true
false
true
0.56625
0.666002
66.600176
0.503363
29.032479
0.023414
2.34139
0.282718
4.362416
0.375208
4.201042
0.360705
28.967199
false
2024-05-17
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-RDPO
princeton-nlp_Llama-3-Instruct-8B-RDPO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2 📑
princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2
4e5bc9779cba3a2f615379d3f8ef1bbb3ea487f7
24.427995
1
8
true
true
true
false
true
0.557948
0.707692
70.769226
0.504922
28.854277
0.050604
5.060423
0.292785
5.704698
0.380448
5.35599
0.37741
30.82336
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2
princeton-nlp_Llama-3-Instruct-8B-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-RRHF 📑
princeton-nlp/Llama-3-Instruct-8B-RRHF
73561d9b0fd42b94250246f8d794251fe9f9d2e9
24.059318
0
8
true
true
true
false
true
0.639216
0.727451
72.745094
0.491055
27.216485
0.095166
9.516616
0.280201
4.026846
0.347552
1.477344
0.364362
29.373522
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Instruct-8B-RRHF
princeton-nlp_Llama-3-Instruct-8B-RRHF-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 📑
princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2
81191fbb214d17f0a4fec247da5d648f4cb61ef1
23.753751
0
8
true
true
true
false
true
0.505873
0.712488
71.248842
0.49839
28.498724
0.087613
8.761329
0.260067
1.342282
0.373781
5.089323
0.348238
27.582004
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF 📑
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF
7e9001f6f4fe940c363bb7ea1814d33c79b21737
25.056382
0
8
true
true
true
false
true
0.725192
0.739966
73.996551
0.502942
29.211612
0.082326
8.232628
0.286074
4.809843
0.372292
5.369792
0.358461
28.717863
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2 📑
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2
1821cc42189d8dab9e157c31b223dc60fc037c2d
23.728355
0
8
true
true
true
false
true
0.521239
0.710965
71.096468
0.49839
28.498724
0.087613
8.761329
0.260067
1.342282
0.373781
5.089323
0.348238
27.582004
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2
princeton-nlp_Llama-3-Instruct-8B-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-SimPO 📑
princeton-nlp/Llama-3-Instruct-8B-SimPO
f700cb6afb4509b10dea43ab72bb0e260e166be4
22.657116
55
8
true
true
true
false
true
0.533346
0.65039
65.038985
0.484468
26.709133
0.02568
2.567976
0.293624
5.816555
0.394833
8.154167
0.348903
27.655881
false
2024-05-17
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-SimPO
princeton-nlp_Llama-3-Instruct-8B-SimPO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2 📑
princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2
9ac0fbee445e7755e50520e9881d67588b4b854c
24.474601
5
8
true
true
true
false
true
0.579982
0.680865
68.086455
0.503834
29.214022
0.057402
5.740181
0.301174
6.823266
0.398802
7.85026
0.362201
29.133422
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2
princeton-nlp_Mistral-7B-Base-SFT-CPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Base-SFT-CPO 📑
princeton-nlp/Mistral-7B-Base-SFT-CPO
7f67394668b94a9ddfb64daff8976b48b135d96c
17.373794
0
7
true
true
true
false
true
0.809769
0.465493
46.549267
0.438215
21.857696
0.026435
2.643505
0.291946
5.592841
0.407083
9.252083
0.265126
18.34737
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-CPO
princeton-nlp_Mistral-7B-Base-SFT-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Base-SFT-DPO 📑
princeton-nlp/Mistral-7B-Base-SFT-DPO
17134fd80cfbf3980353967a30dc6f450f18f78f
16.236325
0
7
true
true
true
false
true
0.66762
0.440338
44.03383
0.435011
20.79098
0.016616
1.661631
0.272651
3.020134
0.412229
9.628646
0.264545
18.282728
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-DPO
princeton-nlp_Mistral-7B-Base-SFT-IPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Base-SFT-IPO 📑
princeton-nlp/Mistral-7B-Base-SFT-IPO
eea781724e4d2ab8bdda7c13526f042de4cfae41
17.210428
0
7
true
true
true
false
true
0.667334
0.482953
48.295301
0.445802
23.703491
0.024924
2.492447
0.280201
4.026846
0.377625
4.836458
0.279172
19.908023
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-IPO
princeton-nlp_Mistral-7B-Base-SFT-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Base-SFT-KTO 📑
princeton-nlp/Mistral-7B-Base-SFT-KTO
02148bb9241b0f4bb0c75e93893eed005abe25e8
18.96264
0
7
true
true
true
false
true
0.666017
0.478482
47.848154
0.447643
23.107642
0.036254
3.625378
0.290268
5.369128
0.436781
13.03099
0.287151
20.794548
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-KTO
princeton-nlp_Mistral-7B-Base-SFT-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Base-SFT-RDPO 📑
princeton-nlp/Mistral-7B-Base-SFT-RDPO
2a63a6d9e1978c99444e440371268f7c2b7e0375
16.465757
0
7
true
true
true
false
true
0.662505
0.460647
46.064664
0.443953
22.98201
0.020393
2.039275
0.277685
3.691275
0.357938
4.275521
0.277676
19.7418
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-RDPO
princeton-nlp_Mistral-7B-Base-SFT-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Base-SFT-RRHF 📑
princeton-nlp/Mistral-7B-Base-SFT-RRHF
0d5861072e9d01f420451bf6a5b108bc8d3a76bc
16.194613
0
7
true
true
true
false
true
0.669001
0.440663
44.0663
0.428059
19.598831
0.02568
2.567976
0.290268
5.369128
0.418677
10.034635
0.239777
15.530807
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-RRHF
princeton-nlp_Mistral-7B-Base-SFT-SLiC-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF 📑
princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF
65d2cc49ad05258da3d982b39682c7f672f5e4ab
18.955533
0
7
true
true
true
false
true
0.668442
0.512728
51.272845
0.44224
22.304723
0.032477
3.247734
0.291946
5.592841
0.426083
11.527083
0.278092
19.787973
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF
princeton-nlp_Mistral-7B-Base-SFT-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Base-SFT-SimPO 📑
princeton-nlp/Mistral-7B-Base-SFT-SimPO
9d9e8b8de4f673d45bc826efc4a1444f9d480222
16.893545
0
7
true
true
true
false
true
0.635706
0.470064
47.006387
0.439805
22.332886
0.006042
0.60423
0.283557
4.474273
0.397063
8.032813
0.270196
18.910683
false
2024-05-17
2024-09-21
0
princeton-nlp/Mistral-7B-Base-SFT-SimPO
princeton-nlp_Mistral-7B-Instruct-CPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-CPO 📑
princeton-nlp/Mistral-7B-Instruct-CPO
32492f8e5588f06005689ac944c2ea39c394c28e
15.565535
0
7
true
true
true
false
true
0.645922
0.420305
42.030479
0.406922
17.248538
0.021903
2.190332
0.26594
2.12528
0.417844
10.897135
0.270113
18.901448
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-CPO
princeton-nlp_Mistral-7B-Instruct-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-DPO 📑
princeton-nlp/Mistral-7B-Instruct-DPO
5e96cff70d8db87cf17c616429c17c8dc9352543
16.549607
0
7
true
true
true
false
true
0.605267
0.517624
51.762435
0.406036
16.875389
0.030211
3.021148
0.268456
2.46085
0.383333
5.75
0.27485
19.427822
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-DPO
princeton-nlp_Mistral-7B-Instruct-IPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-IPO 📑
princeton-nlp/Mistral-7B-Instruct-IPO
32ad99c6e7231bbe8ebd9d24b28e084c60848558
17.707096
0
7
true
true
true
false
true
0.625748
0.49292
49.29199
0.432218
20.09411
0.019637
1.963746
0.27349
3.131991
0.432417
12.785417
0.270778
18.975325
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-IPO
princeton-nlp_Mistral-7B-Instruct-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-KTO 📑
princeton-nlp/Mistral-7B-Instruct-KTO
834422e5b9b9eee6aac2f8d4822b925a6574d628
16.664827
0
7
true
true
true
false
true
0.603378
0.490797
49.079664
0.413959
17.812648
0.024169
2.416918
0.27349
3.131991
0.395271
7.408854
0.28125
20.138889
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-KTO
princeton-nlp_Mistral-7B-Instruct-ORPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-ORPO 📑
princeton-nlp/Mistral-7B-Instruct-ORPO
69c0481f4100629a49ae73f760ddbb61d8e98e48
16.050529
0
7
true
true
true
false
true
0.624297
0.471962
47.196217
0.410406
18.038373
0.02719
2.719033
0.274329
3.243848
0.39124
6.638281
0.266207
18.46742
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-ORPO
princeton-nlp_Mistral-7B-Instruct-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-RDPO 📑
princeton-nlp/Mistral-7B-Instruct-RDPO
23ec6ab4f996134eb15c19322dabb34d7332d7cd
16.420491
0
7
true
true
true
false
true
0.610616
0.488723
48.872325
0.405015
17.048388
0.024169
2.416918
0.280201
4.026846
0.387333
6.416667
0.277676
19.7418
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-RDPO
princeton-nlp_Mistral-7B-Instruct-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-RRHF 📑
princeton-nlp/Mistral-7B-Instruct-RRHF
493d3ceb571232fe3b2f55c0bf78692760f4fc7e
16.829083
0
7
true
true
true
false
true
0.587751
0.496017
49.601723
0.418977
19.206552
0.024169
2.416918
0.276007
3.467562
0.397875
7.934375
0.265126
18.34737
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-RRHF
princeton-nlp_Mistral-7B-Instruct-SLiC-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-SLiC-HF 📑
princeton-nlp/Mistral-7B-Instruct-SLiC-HF
3d08c8b7c3e73beb2a3264848f17246b74c3d162
16.376556
0
7
true
true
true
false
true
0.622453
0.511529
51.152941
0.404001
16.653429
0.016616
1.661631
0.272651
3.020134
0.391302
6.71276
0.271526
19.058437
false
2024-07-06
2024-10-16
0
princeton-nlp/Mistral-7B-Instruct-SLiC-HF
princeton-nlp_Mistral-7B-Instruct-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
princeton-nlp/Mistral-7B-Instruct-SimPO 📑
princeton-nlp/Mistral-7B-Instruct-SimPO
03191ee1e60d21a698d11a515703a037073724f8
17.569551
1
7
false
true
true
false
true
0.570562
0.46869
46.868974
0.450723
22.382277
0.026435
2.643505
0.278523
3.803132
0.409781
9.75599
0.279671
19.963431
false
2024-05-24
2024-09-21
0
princeton-nlp/Mistral-7B-Instruct-SimPO
princeton-nlp_Sheared-LLaMA-1.3B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
princeton-nlp/Sheared-LLaMA-1.3B 📑
princeton-nlp/Sheared-LLaMA-1.3B
a4b76938edbf571ea7d7d9904861cbdca08809b4
5.505397
apache-2.0
91
1
true
true
true
false
false
0.3546
0.21977
21.977021
0.319705
4.74463
0.008308
0.830816
0.239933
0
0.371302
3.579427
0.117104
1.900488
false
2023-10-10
2024-07-29
0
princeton-nlp/Sheared-LLaMA-1.3B
princeton-nlp_Sheared-LLaMA-2.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
princeton-nlp/Sheared-LLaMA-2.7B 📑
princeton-nlp/Sheared-LLaMA-2.7B
2f157a0306b75d37694ae05f6a4067220254d540
6.324627
apache-2.0
60
2
true
true
true
false
false
0.47005
0.241652
24.165215
0.325869
5.655521
0.006042
0.60423
0.275168
3.355705
0.356729
2.091146
0.118684
2.075946
false
2023-10-10
2024-07-29
0
princeton-nlp/Sheared-LLaMA-2.7B
princeton-nlp_gemma-2-9b-it-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
princeton-nlp/gemma-2-9b-it-DPO 📑
princeton-nlp/gemma-2-9b-it-DPO
f646c99fc3aa7afc7b22c3c7115fd03a40fc1d22
19.434035
5
9
false
true
true
false
true
2.890627
0.276872
27.687203
0.594144
41.593654
0
0
0.33557
11.409396
0.382031
5.653906
0.37234
30.260047
false
2024-07-16
2024-09-19
2
google/gemma-2-9b
princeton-nlp_gemma-2-9b-it-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
princeton-nlp/gemma-2-9b-it-SimPO 📑
princeton-nlp/gemma-2-9b-it-SimPO
8c87091f412e3aa6f74f66bd86c57fb81cbc3fde
21.161652
mit
123
9
true
true
true
false
true
2.769004
0.320686
32.068578
0.583918
40.09343
0
0
0.33557
11.409396
0.412323
10.340365
0.397523
33.058141
false
2024-07-16
2024-08-10
2
google/gemma-2-9b
pszemraj_Llama-3-6.3b-v0.1_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
pszemraj/Llama-3-6.3b-v0.1 📑
pszemraj/Llama-3-6.3b-v0.1
7000b39346162f95f19aa4ca3975242db61902d7
10.333954
llama3
6
6
true
true
true
false
false
0.814463
0.10439
10.438969
0.419681
18.679996
0.018127
1.812689
0.283557
4.474273
0.390833
6.154167
0.283993
20.443632
false
2024-05-17
2024-06-26
1
meta-llama/Meta-Llama-3-8B
pszemraj_Mistral-v0.3-6B_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
pszemraj/Mistral-v0.3-6B 📑
pszemraj/Mistral-v0.3-6B
ae11a699012b83996361f04808f4d45debf3b01c
10.046851
apache-2.0
1
5
true
true
true
false
false
0.530539
0.245374
24.53745
0.377405
13.515091
0.009063
0.906344
0.265101
2.013423
0.390771
6.613021
0.214262
12.695774
false
2024-05-25
2024-06-26
2
pszemraj/Mistral-7B-v0.3-prune6 (Merge)
qingy2019_LLaMa_3.2_3B_Catalysts_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
qingy2019/LLaMa_3.2_3B_Catalysts 📑
qingy2019/LLaMa_3.2_3B_Catalysts
3f4a318114beb37f32a2c143cbd68b6d15d18164
19.628816
apache-2.0
1
3
true
true
true
false
false
0.649834
0.49924
49.923979
0.446813
21.345401
0.111027
11.102719
0.288591
5.145414
0.378771
7.946354
0.300781
22.309028
false
2024-10-19
2024-10-29
2
meta-llama/Llama-3.2-3B-Instruct
qq8933_OpenLongCoT-Base-Gemma2-2B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
qq8933/OpenLongCoT-Base-Gemma2-2B 📑
qq8933/OpenLongCoT-Base-Gemma2-2B
39e5bc941f107ac28142c802aecfd257cc47c1bb
5.08291
other
8
3
true
true
true
false
true
1.658487
0.196514
19.651414
0.310636
3.546298
0
0
0.262584
1.677852
0.32225
2.114583
0.131566
3.507314
false
2024-10-28
2024-11-12
2
google/gemma-2-2b
rasyosef_Mistral-NeMo-Minitron-8B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
rasyosef/Mistral-NeMo-Minitron-8B-Chat 📑
rasyosef/Mistral-NeMo-Minitron-8B-Chat
cede47eac8a4e65aa27567d3f087c28185b537d9
17.230946
other
8
8
true
true
true
false
true
1.476398
0.445184
44.518433
0.475944
26.036695
0.008308
0.830816
0.276007
3.467562
0.430427
12.936719
0.240359
15.595449
false
2024-08-26
2024-08-26
1
nvidia/Mistral-NeMo-Minitron-8B-Base
rasyosef_Phi-1_5-Instruct-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
rasyosef/Phi-1_5-Instruct-v0.1 📑
rasyosef/Phi-1_5-Instruct-v0.1
f4c405ee4bff5dc1a69383f3fe682342c9c87c77
6.638162
mit
0
1
true
true
true
false
true
0.295022
0.240228
24.022815
0.31179
4.820244
0
0
0.260067
1.342282
0.342156
3.402865
0.156167
6.240765
false
2024-07-24
2024-07-25
1
microsoft/phi-1_5
rasyosef_phi-2-instruct-apo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
rasyosef/phi-2-instruct-apo 📑
rasyosef/phi-2-instruct-apo
2d3722d6db77a8c844a50dd32ddc4278fdc89e1f
12.043528
mit
0
2
true
true
true
false
true
0.495065
0.314592
31.459195
0.44451
21.672438
0
0
0.270134
2.684564
0.334219
3.610677
0.215509
12.834294
false
2024-09-15
2024-09-17
1
microsoft/phi-2
rasyosef_phi-2-instruct-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
rasyosef/phi-2-instruct-v0.1 📑
rasyosef/phi-2-instruct-v0.1
29aeb3ccf7c79e0169a038fbd0deaf9772a9fefd
14.218631
mit
2
2
true
true
true
false
true
0.492726
0.368148
36.814763
0.472612
26.358802
0
0
0.274329
3.243848
0.352354
5.044271
0.224651
13.850103
false
2024-08-09
2024-08-10
1
microsoft/phi-2
realtreetune_rho-1b-sft-MATH_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
realtreetune/rho-1b-sft-MATH 📑
realtreetune/rho-1b-sft-MATH
b5f93df6af679a860caac9a9598e0f70c326b4fb
5.355177
0
1
false
true
true
false
false
0.278134
0.212102
21.210167
0.314415
4.197623
0.021903
2.190332
0.252517
0.33557
0.345844
2.897135
0.111702
1.300236
false
2024-06-06
2024-10-05
1
realtreetune/rho-1b-sft-MATH (Merge)
recoilme_Gemma-2-Ataraxy-Gemmasutra-9B-slerp_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp 📑
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp
9048af8616bc62b6efab2bc1bc77ba53c5dfed79
29.873992
apache-2.0
3
10
true
true
true
false
true
2.114373
0.764895
76.489492
0.597439
42.25121
0.017372
1.73716
0.330537
10.738255
0.424479
12.393229
0.420711
35.634604
false
2024-09-11
2024-09-12
0
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp
recoilme_Gemma-2-Ataraxy-Gemmasutra-9B-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp 📑
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp
5a4f7299d9f8ea5faad2b1edc68b7bf634dac40b
23.205618
apache-2.0
3
10
true
true
true
false
false
2.969828
0.285365
28.536505
0.598393
42.703798
0.058157
5.81571
0.329698
10.626398
0.460656
16.415365
0.416223
35.135934
false
2024-09-11
2024-09-27
0
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp
recoilme_recoilme-gemma-2-9B-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
recoilme/recoilme-gemma-2-9B-v0.1 📑
recoilme/recoilme-gemma-2-9B-v0.1
6dc0997046db4e9932f87d338ecdc2a4158abbda
29.602746
0
10
false
true
true
false
true
1.924809
0.751506
75.1506
0.599531
42.321861
0.016616
1.661631
0.338926
11.856823
0.419146
11.526563
0.415891
35.098995
false
2024-09-18
0
Removed
recoilme_recoilme-gemma-2-9B-v0.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
recoilme/recoilme-gemma-2-9B-v0.2 📑
recoilme/recoilme-gemma-2-9B-v0.2
483116e575fb3a56de25243b14d715c58fe127bc
30.048864
cc-by-nc-4.0
1
10
true
true
true
false
true
1.914086
0.759175
75.917455
0.602596
43.027969
0.05287
5.287009
0.328859
10.514541
0.409875
10.401042
0.416307
35.145168
false
2024-09-18
2024-09-18
0
recoilme/recoilme-gemma-2-9B-v0.2
recoilme_recoilme-gemma-2-9B-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
recoilme/recoilme-gemma-2-9B-v0.2 📑
recoilme/recoilme-gemma-2-9B-v0.2
483116e575fb3a56de25243b14d715c58fe127bc
23.674735
cc-by-nc-4.0
1
10
true
true
true
false
false
2.946784
0.274699
27.469891
0.603083
43.560581
0.077795
7.779456
0.330537
10.738255
0.468594
17.807552
0.412234
34.692671
false
2024-09-18
2024-09-27
0
recoilme/recoilme-gemma-2-9B-v0.2
recoilme_recoilme-gemma-2-9B-v0.3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
recoilme/recoilme-gemma-2-9B-v0.3 📑
recoilme/recoilme-gemma-2-9B-v0.3
772cab46d9d22cbcc3c574d193021803ce5c444c
30.207472
cc-by-nc-4.0
3
10
true
true
true
false
true
1.876637
0.743937
74.39372
0.599253
42.026279
0.087613
8.761329
0.323826
9.8434
0.420385
12.08151
0.407247
34.138593
false
2024-09-18
2024-09-18
0
recoilme/recoilme-gemma-2-9B-v0.3
recoilme_recoilme-gemma-2-9B-v0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
recoilme/recoilme-gemma-2-9B-v0.3 📑
recoilme/recoilme-gemma-2-9B-v0.3
76c8fb761660e6eb237c91bb6e6761ee36266bba
30.111638
cc-by-nc-4.0
3
10
true
true
true
false
false
2.55535
0.576076
57.607592
0.601983
43.326868
0.172961
17.296073
0.337248
11.63311
0.463229
17.036979
0.403923
33.769208
false
2024-09-18
2024-09-27
0
recoilme/recoilme-gemma-2-9B-v0.3
recoilme_recoilme-gemma-2-9B-v0.4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
recoilme/recoilme-gemma-2-9B-v0.4 📑
recoilme/recoilme-gemma-2-9B-v0.4
2691f2cc8d80072f15d78cb7ae72831e1a12139e
24.100363
cc-by-nc-4.0
2
10
true
true
true
false
false
2.91891
0.256189
25.618913
0.596729
42.442482
0.082326
8.232628
0.340604
12.080537
0.472688
18.385938
0.440575
37.841681
false
2024-09-18
2024-09-19
0
recoilme/recoilme-gemma-2-9B-v0.4
refuelai_Llama-3-Refueled_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
refuelai/Llama-3-Refueled 📑
refuelai/Llama-3-Refueled
ff6d1c3ba37b31d4af421951c2300f2256fb3691
22.803805
cc-by-nc-4.0
188
8
true
true
true
false
true
0.875986
0.461995
46.199528
0.587077
41.721971
0.043807
4.380665
0.299497
6.599553
0.445406
14.642448
0.309508
23.278664
true
2024-05-03
2024-06-12
0
refuelai/Llama-3-Refueled
rhplus0831_maid-yuzu-v7_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
rhplus0831/maid-yuzu-v7 📑
rhplus0831/maid-yuzu-v7
a0bd8c707bb80024778da4a0d057917faa53d2f6
24.48193
1
46
false
true
true
false
true
4.104285
0.646243
64.624308
0.480492
26.819837
0.095166
9.516616
0.309564
7.941834
0.413625
9.769792
0.353973
28.219193
false
2024-02-09
2024-09-08
1
rhplus0831/maid-yuzu-v7 (Merge)
rhymes-ai_Aria_bfloat16
bfloat16
🌸 multimodal
🌸
Original
AriaForConditionalGeneration
rhymes-ai/Aria 📑
rhymes-ai/Aria
5cc2703b3afd585f232ec5027e9c039a2001bcec
28.354051
apache-2.0
583
25
true
true
false
false
true
7.75071
0.477308
47.730799
0.569531
39.281493
0.162387
16.238671
0.362416
14.988814
0.43375
14.052083
0.440492
37.832447
true
2024-09-26
2024-10-10
0
rhymes-ai/Aria
rhysjones_phi-2-orange-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
rhysjones/phi-2-orange-v2 📑
rhysjones/phi-2-orange-v2
f4085189114accfb65225deb8fbdf15767b7ee56
14.644427
mit
27
2
true
true
true
false
true
0.470949
0.366974
36.697407
0.477022
25.606549
0
0
0.261745
1.565996
0.362958
6.969792
0.253241
17.026817
false
2024-03-04
2024-06-28
0
rhysjones/phi-2-orange-v2
riaz_FineLlama-3.1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
riaz/FineLlama-3.1-8B 📑
riaz/FineLlama-3.1-8B
c4d8f16eb446910edce0c1afd0e6d5f3b06e2e7d
17.610296
apache-2.0
1
8
true
true
true
false
true
0.921092
0.437341
43.73407
0.458573
24.148778
0.048338
4.833837
0.275168
3.355705
0.376292
7.769792
0.296376
21.819592
false
2024-10-07
2024-10-12
2
meta-llama/Meta-Llama-3.1-8B
riaz_FineLlama-3.1-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
riaz/FineLlama-3.1-8B 📑
riaz/FineLlama-3.1-8B
c4d8f16eb446910edce0c1afd0e6d5f3b06e2e7d
17.147511
apache-2.0
1
8
true
true
true
false
true
0.901998
0.41366
41.36602
0.456452
23.77339
0.045317
4.531722
0.276007
3.467562
0.377625
7.769792
0.297789
21.976581
false
2024-10-07
2024-10-12
2
meta-llama/Meta-Llama-3.1-8B
rmdhirr_Gluon-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
rmdhirr/Gluon-8B 📑
rmdhirr/Gluon-8B
cc949908c60ab7f696e133714222d6cab156e493
23.951787
apache-2.0
1
8
true
false
true
false
false
0.903078
0.505285
50.528487
0.515331
30.342247
0.142749
14.274924
0.312081
8.277405
0.403885
9.085677
0.380818
31.20198
false
2024-09-14
2024-09-14
1
rmdhirr/Gluon-8B (Merge)
rombodawg_Rombos-LLM-V2.5-Qwen-0.5b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5-Qwen-0.5b 📑
rombodawg/Rombos-LLM-V2.5-Qwen-0.5b
aae2e55548c8090ce357c64ca78e8b9ef6baf118
8.71875
apache-2.0
3
0
true
true
true
false
false
0.645707
0.284667
28.466691
0.329368
8.412219
0.027946
2.794562
0.266779
2.237136
0.323583
0.78125
0.186586
9.620641
false
2024-10-06
2024-09-29
1
rombodawg/Rombos-LLM-V2.5-Qwen-0.5b (Merge)
rombodawg_Rombos-LLM-V2.5-Qwen-1.5b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5-Qwen-1.5b 📑
rombodawg/Rombos-LLM-V2.5-Qwen-1.5b
1f634da015ed671efe7dc574bc2a1954f5b2cc93
16.165564
apache-2.0
2
1
true
true
true
false
false
0.740358
0.340246
34.02461
0.42567
18.711344
0.074018
7.401813
0.288591
5.145414
0.418552
10.352344
0.292221
21.357861
false
2024-10-06
2024-09-29
1
rombodawg/Rombos-LLM-V2.5-Qwen-1.5b (Merge)
rombodawg_Rombos-LLM-V2.5-Qwen-14b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5-Qwen-14b 📑
rombodawg/Rombos-LLM-V2.5-Qwen-14b
834ddb1712ae6d1b232b2d5b26be658d90d23e43
34.73006
apache-2.0
5
14
true
true
true
false
false
2.1827
0.584045
58.404478
0.648109
49.3869
0.169184
16.918429
0.371644
16.219239
0.471729
18.832812
0.537566
48.618499
false
2024-10-06
2024-09-29
1
rombodawg/Rombos-LLM-V2.5-Qwen-14b (Merge)
rombodawg_Rombos-LLM-V2.5-Qwen-32b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5-Qwen-32b 📑
rombodawg/Rombos-LLM-V2.5-Qwen-32b
234abe4b494dbe83ba805b791f74feb33462a33d
44.5742
apache-2.0
24
32
true
true
true
false
false
17.91269
0.682663
68.266311
0.704554
58.261894
0.41994
41.993958
0.396812
19.574944
0.503417
24.727083
0.591589
54.621011
false
2024-09-30
2024-10-07
1
rombodawg/Rombos-LLM-V2.5-Qwen-32b (Merge)
rombodawg_Rombos-LLM-V2.5-Qwen-3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5-Qwen-3b 📑
rombodawg/Rombos-LLM-V2.5-Qwen-3b
26601a8da5afce3b5959d91bdd0faaab6df8bf95
22.183111
other
2
3
true
true
true
false
false
1.005794
0.534236
53.423583
0.48089
27.213597
0.055136
5.513595
0.307886
7.718121
0.404167
8.554167
0.37608
30.675606
false
2024-10-06
2024-09-29
1
rombodawg/Rombos-LLM-V2.5-Qwen-3b (Merge)
rombodawg_Rombos-LLM-V2.5-Qwen-72b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5-Qwen-72b 📑
rombodawg/Rombos-LLM-V2.5-Qwen-72b
5260f182e7859e13d515c4cb3926ac85ad057504
45.909246
other
23
72
true
true
true
false
false
16.033946
0.715536
71.553589
0.722959
61.267145
0.506798
50.679758
0.39849
19.798658
0.459917
17.322917
0.593501
54.833407
false
2024-09-30
2024-09-30
1
rombodawg/Rombos-LLM-V2.5-Qwen-72b (Merge)
rombodawg_Rombos-LLM-V2.5-Qwen-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5-Qwen-7b 📑
rombodawg/Rombos-LLM-V2.5-Qwen-7b
dbd819e8f765181f774cb5b79812d081669eb302
31.112348
apache-2.0
14
7
true
true
true
false
false
1.317084
0.623712
62.371175
0.554389
36.37235
0.283233
28.323263
0.317953
9.060403
0.429094
12.003385
0.446892
38.543514
false
2024-10-06
2024-09-29
1
rombodawg/Rombos-LLM-V2.5-Qwen-7b (Merge)
rombodawg_Rombos-LLM-V2.5.1-Qwen-3b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5.1-Qwen-3b 📑
rombodawg/Rombos-LLM-V2.5.1-Qwen-3b
a3305ce148f4273ab334052ab47d3aebb51d104c
13.357125
other
1
3
true
false
true
false
false
0.929244
0.259513
25.951254
0.388404
14.881409
0.09139
9.138973
0.274329
3.243848
0.399115
7.822656
0.271941
19.10461
false
2024-10-08
2024-10-08
1
rombodawg/Rombos-LLM-V2.5.1-Qwen-3b (Merge)
rombodawg_Rombos-LLM-V2.5.1-Qwen-3b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
rombodawg/Rombos-LLM-V2.5.1-Qwen-3b 📑
rombodawg/Rombos-LLM-V2.5.1-Qwen-3b
b65848c13b31f5b9d5d953df95d504d195082a3b
13.130247
other
1
3
true
false
true
false
false
1.954031
0.25664
25.664016
0.390008
15.057744
0.092145
9.214502
0.262584
1.677852
0.399115
7.822656
0.274102
19.34471
false
2024-10-08
2024-11-14
1
rombodawg/Rombos-LLM-V2.5.1-Qwen-3b (Merge)
rombodawg_Rombos-LLM-V2.6-Nemotron-70b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/rombodawg/Rombos-LLM-V2.6-Nemotron-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rombodawg/Rombos-LLM-V2.6-Nemotron-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.6-Nemotron-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rombodawg/Rombos-LLM-V2.6-Nemotron-70b
951c9cdf68d6e679c78625d1a1f396eb71cdf746
41.933642
llama3.1
2
70
true
true
true
false
false
11.950774
0.752655
75.265518
0.69377
55.805573
0.332326
33.232628
0.40604
20.805369
0.466865
18.391406
0.532912
48.101359
false
2024-10-17
2024-10-17
0
rombodawg/Rombos-LLM-V2.6-Nemotron-70b
rombodawg_Rombos-LLM-V2.6-Qwen-14b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/rombodawg/Rombos-LLM-V2.6-Qwen-14b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rombodawg/Rombos-LLM-V2.6-Qwen-14b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.6-Qwen-14b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rombodawg/Rombos-LLM-V2.6-Qwen-14b
887910d75a1837b8b8c7c3e50a257517d286ec60
36.353495
apache-2.0
42
14
true
true
true
false
false
2.179929
0.521446
52.144643
0.648205
49.217784
0.316465
31.646526
0.377517
17.002237
0.47675
19.260417
0.539644
48.849365
false
2024-10-12
2024-10-13
1
rombodawg/Rombos-LLM-V2.6-Qwen-14b (Merge)
rombodawg_rombos_Replete-Coder-Instruct-8b-Merged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/rombodawg/rombos_Replete-Coder-Instruct-8b-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rombodawg/rombos_Replete-Coder-Instruct-8b-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__rombos_Replete-Coder-Instruct-8b-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rombodawg/rombos_Replete-Coder-Instruct-8b-Merged
85ad1fb943d73866ba5c8dcfe4a4f2cbfba12d4d
16.433824
apache-2.0
1
8
true
true
true
false
true
0.964128
0.538757
53.875716
0.446169
21.937707
0.077795
7.779456
0.269295
2.572707
0.366031
3.453906
0.180851
8.983452
false
2024-10-06
2024-10-14
0
rombodawg/rombos_Replete-Coder-Instruct-8b-Merged
rombodawg_rombos_Replete-Coder-Llama3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/rombodawg/rombos_Replete-Coder-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rombodawg/rombos_Replete-Coder-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__rombos_Replete-Coder-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rombodawg/rombos_Replete-Coder-Llama3-8B
938a45789cf94821ef6b12c98dc76622a0fa936a
11.832564
other
2
8
true
true
true
false
true
1.205602
0.471413
47.141252
0.327628
7.087845
0.030967
3.096677
0.266779
2.237136
0.396635
7.71276
0.133477
3.71971
false
2024-10-06
2024-10-14
0
rombodawg/rombos_Replete-Coder-Llama3-8B
rwitz_go-bruins-v2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/rwitz/go-bruins-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rwitz/go-bruins-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rwitz__go-bruins-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rwitz/go-bruins-v2
6d9e57d3a36dbad364ec77ca642873d9fc7fd61c
15.421379
0
7
false
true
true
false
true
0.63782
0.409589
40.958878
0.379884
12.69326
0.066465
6.646526
0.262584
1.677852
0.41375
10.985417
0.276097
19.566342
false
2024-06-26
0
Removed
saishf_Fimbulvetr-Kuro-Lotus-10.7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/saishf/Fimbulvetr-Kuro-Lotus-10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saishf/Fimbulvetr-Kuro-Lotus-10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saishf__Fimbulvetr-Kuro-Lotus-10.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saishf/Fimbulvetr-Kuro-Lotus-10.7B
ec1288fd8c06ac408a2a7e503ea62ac300e474e1
20.023285
cc-by-nc-4.0
17
10
true
false
true
false
true
0.809169
0.493944
49.394385
0.434232
19.908821
0.01435
1.435045
0.301174
6.823266
0.44451
16.030469
0.33893
26.547725
false
2024-02-13
2024-07-09
1
saishf/Fimbulvetr-Kuro-Lotus-10.7B (Merge)
sakhan10_quantized_open_llama_3b_v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sakhan10/quantized_open_llama_3b_v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sakhan10/quantized_open_llama_3b_v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sakhan10__quantized_open_llama_3b_v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sakhan10/quantized_open_llama_3b_v2
e8d51ad5204806edf9c2eeb8c56139a440a70265
5.1425
0
3
false
true
true
false
false
0.3927
0.187222
18.722213
0.30198
2.805733
0
0
0.276846
3.579418
0.368167
4.6875
0.109541
1.060136
false
2024-08-23
2024-08-28
1
openlm-research/open_llama_3b_v2
saltlux_luxia-21.4b-alignment-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saltlux/luxia-21.4b-alignment-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saltlux__luxia-21.4b-alignment-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saltlux/luxia-21.4b-alignment-v1.0
87d5673e6d9f60462f195e9414a0bf6874c89ceb
22.925873
apache-2.0
32
21
true
true
true
false
true
1.744047
0.369297
36.92968
0.637334
48.021113
0.06571
6.570997
0.301174
6.823266
0.432844
12.505469
0.340342
26.704713
false
2024-03-12
2024-06-29
0
saltlux/luxia-21.4b-alignment-v1.0
saltlux_luxia-21.4b-alignment-v1.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saltlux/luxia-21.4b-alignment-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saltlux__luxia-21.4b-alignment-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saltlux/luxia-21.4b-alignment-v1.2
eed12b5574fa49cc81e57a88aff24c08c13721c0
23.435192
apache-2.0
8
21
true
true
true
false
true
2.045926
0.411537
41.153694
0.637118
47.769165
0.015861
1.586103
0.307886
7.718121
0.445896
14.903646
0.347324
27.480423
false
2024-05-27
2024-07-30
0
saltlux/luxia-21.4b-alignment-v1.2
sam-paech_Darkest-muse-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/sam-paech/Darkest-muse-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sam-paech/Darkest-muse-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sam-paech__Darkest-muse-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sam-paech/Darkest-muse-v1
55f6ba0218e9615d18a76f244a874b941f8c434f
31.810869
apache-2.0
16
10
true
true
true
false
false
2.206947
0.73442
73.442023
0.596844
42.611731
0.116314
11.63142
0.34396
12.527964
0.450208
15.276042
0.418384
35.376034
false
2024-10-22
2024-10-26
1
sam-paech/Darkest-muse-v1 (Merge)
sam-paech_Delirium-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/sam-paech/Delirium-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sam-paech/Delirium-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sam-paech__Delirium-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sam-paech/Delirium-v1
98dc2dad47af405013c0584d752504ca448bd8eb
31.732318
gemma
7
9
true
true
true
false
false
2.395501
0.720756
72.075648
0.596211
42.315079
0.129154
12.915408
0.343121
12.416107
0.451448
15.23099
0.418966
35.440677
false
2024-10-17
2024-10-26
1
unsloth/gemma-2-9b-it
sam-paech_Quill-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/sam-paech/Quill-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sam-paech/Quill-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sam-paech__Quill-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sam-paech/Quill-v1
3cab1cac9d3de0d25b48ea86b4533aa220231f20
31.503021
4
9
false
true
true
false
false
2.313469
0.712214
71.221359
0.596923
42.597669
0.11858
11.858006
0.339765
11.96868
0.455479
16.134896
0.417138
35.237515
false
2024-10-20
2024-10-26
1
sam-paech/Quill-v1 (Merge)
schnapss_testmerge-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/schnapss/testmerge-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">schnapss/testmerge-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/schnapss__testmerge-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
schnapss/testmerge-7b
ff84f5b87ba51db9622b1c553c076533890a8f50
20.913446
0
7
false
true
true
false
false
0.470155
0.392228
39.222818
0.518748
32.638166
0.068731
6.873112
0.296141
6.152125
0.468563
17.703646
0.306017
22.89081
false
2024-11-16
2024-11-16
1
schnapss/testmerge-7b (Merge)
sci-m-wang_Mistral-7B-Instruct-sa-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sci-m-wang/Mistral-7B-Instruct-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sci-m-wang/Mistral-7B-Instruct-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sci-m-wang__Mistral-7B-Instruct-sa-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sci-m-wang/Mistral-7B-Instruct-sa-v0.1
2dcff66eac0c01dc50e4c41eea959968232187fe
12.200064
other
0
14
true
true
true
false
true
0.765082
0.433519
43.351862
0.327278
5.743646
0.010574
1.057402
0.259228
1.230425
0.39
6.683333
0.236203
15.133717
false
2024-05-31
2024-06-27
2
mistralai/Mistral-7B-v0.1
sci-m-wang_Phi-3-mini-4k-instruct-sa-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sci-m-wang__Phi-3-mini-4k-instruct-sa-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1
5a516f86087853f9d560c95eb9209c1d4ed9ff69
25.773792
other
0
7
true
true
true
false
true
1.280503
0.502062
50.206231
0.550204
36.605419
0.145015
14.501511
0.328859
10.514541
0.407302
9.646094
0.398521
33.168957
false
2024-06-01
2024-06-27
1
microsoft/Phi-3-mini-4k-instruct
sci-m-wang_deepseek-llm-7b-chat-sa-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sci-m-wang/deepseek-llm-7b-chat-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sci-m-wang/deepseek-llm-7b-chat-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sci-m-wang__deepseek-llm-7b-chat-sa-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sci-m-wang/deepseek-llm-7b-chat-sa-v0.1
afbda8b347ec881666061fa67447046fc5164ec8
13.119933
other
0
7
true
true
true
false
true
0.991574
0.403594
40.359358
0.371772
12.051975
0.021148
2.114804
0.256711
0.894855
0.417313
9.864062
0.220911
13.434545
false
2024-05-31
2024-06-27
1
deepseek-ai/deepseek-llm-7b-chat
senseable_WestLake-7B-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/senseable/WestLake-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/WestLake-7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/senseable__WestLake-7B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
senseable/WestLake-7B-v2
41625004c47628837678859753b94c50c82f3bec
16.332594
apache-2.0
109
7
true
true
true
false
true
0.631011
0.441862
44.186204
0.407328
17.858142
0.05287
5.287009
0.276846
3.579418
0.393719
7.48151
0.27643
19.60328
false
2024-01-22
2024-07-23
0
senseable/WestLake-7B-v2
sequelbox_Llama3.1-8B-MOTH_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-MOTH" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-MOTH</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-MOTH-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/Llama3.1-8B-MOTH
8db363e36b1efc9015ab14648e68bcfba9e8d8a0
20.685446
llama3.1
1
8
true
true
true
false
true
1.95448
0.524494
52.44939
0.490247
27.916332
0.112538
11.253776
0.268456
2.46085
0.368917
4.047917
0.33386
25.984412
false
2024-09-01
2024-09-19
2
meta-llama/Meta-Llama-3.1-8B
sequelbox_Llama3.1-8B-PlumChat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-PlumChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-PlumChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-PlumChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/Llama3.1-8B-PlumChat
1afdc9856591f573e4fcb52dba19a9d8da631e0b
13.15179
llama3.1
0
8
true
false
true
false
true
0.989257
0.424276
42.427648
0.387329
13.935991
0.032477
3.247734
0.265101
2.013423
0.375458
4.765625
0.212683
12.520316
false
2024-10-02
2024-10-03
1
sequelbox/Llama3.1-8B-PlumChat (Merge)
sequelbox_Llama3.1-8B-PlumCode_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-PlumCode" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-PlumCode</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-PlumCode-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/Llama3.1-8B-PlumCode
171cd599d574000607491f08e6cf7b7eb199e33d
9.811412
llama3.1
0
8
true
false
true
false
false
0.890676
0.204483
20.448299
0.336809
8.502927
0.026435
2.643505
0.276007
3.467562
0.377344
8.967969
0.233544
14.838209
false
2024-10-02
2024-10-03
1
sequelbox/Llama3.1-8B-PlumCode (Merge)
sequelbox_Llama3.1-8B-PlumMath_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-PlumMath" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-PlumMath</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-PlumMath-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/Llama3.1-8B-PlumMath
b857c30a626f7c020fcba89df7bece4bb7381ac2
13.886333
llama3.1
1
8
true
false
true
false
false
0.868772
0.224242
22.424168
0.40323
16.446584
0.044562
4.456193
0.317953
9.060403
0.391854
8.981771
0.29754
21.948877
false
2024-10-01
2024-10-03
1
sequelbox/Llama3.1-8B-PlumMath (Merge)
sequelbox_gemma-2-9B-MOTH_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/gemma-2-9B-MOTH" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/gemma-2-9B-MOTH</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__gemma-2-9B-MOTH-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/gemma-2-9B-MOTH
8dff98ab82ba0087706afa0d6c69874a45548212
4.553324
gemma
0
9
true
true
true
false
true
3.027949
0.205882
20.588151
0.30797
3.212217
0
0
0.260067
1.342282
0.340948
0.61849
0.114029
1.558806
false
2024-09-09
2024-09-10
2
google/gemma-2-9b
sethuiyer_Qwen2.5-7B-Anvita_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Qwen2.5-7B-Anvita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Qwen2.5-7B-Anvita</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sethuiyer__Qwen2.5-7B-Anvita-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Qwen2.5-7B-Anvita
dc6f8ca6507cc282938e70b23b02c1a3db7b7ddc
29.180839
apache-2.0
0
7
true
true
true
false
true
1.080123
0.648042
64.804164
0.546586
35.482448
0.15861
15.861027
0.327181
10.290828
0.433656
13.473698
0.416556
35.172872
false
2024-10-11
2024-10-27
1
sethuiyer/Qwen2.5-7B-Anvita (Merge)
shadowml_BeagSake-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/shadowml/BeagSake-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">shadowml/BeagSake-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/shadowml__BeagSake-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
shadowml/BeagSake-7B
b7a3b25a188a4608fd05fc4247ddd504c1f529d1
19.063698
cc-by-nc-4.0
1
7
true
false
true
false
true
2.880128
0.521596
52.159603
0.471103
25.192945
0.054381
5.438066
0.28104
4.138702
0.412354
9.844271
0.258477
17.608599
false
2024-01-31
2024-10-29
1
shadowml/BeagSake-7B (Merge)
shadowml_Mixolar-4x7b_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/shadowml/Mixolar-4x7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">shadowml/Mixolar-4x7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/shadowml__Mixolar-4x7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
shadowml/Mixolar-4x7b
bb793526b063765e9861cad8834160fb0945e66d
19.283412
apache-2.0
3
36
true
false
false
false
false
2.354728
0.38933
38.933031
0.521595
32.728964
0
0
0.292785
5.704698
0.42575
12.71875
0.330535
25.615027
false
2023-12-30
2024-08-05
0
shadowml/Mixolar-4x7b
shastraai_Shastra-LLAMA2-Math-Commonsense-SFT_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/shastraai/Shastra-LLAMA2-Math-Commonsense-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">shastraai/Shastra-LLAMA2-Math-Commonsense-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/shastraai__Shastra-LLAMA2-Math-Commonsense-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
shastraai/Shastra-LLAMA2-Math-Commonsense-SFT
97a578246d4edecb5fde3dae262a64e4ec9f489a
10.503347
0
6
false
true
true
false
false
0.764042
0.304151
30.415076
0.384317
13.659523
0.018127
1.812689
0.259228
1.230425
0.360448
4.822656
0.199717
11.079713
false
2024-10-27
0
Removed
shivam9980_NEPALI-LLM_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/shivam9980/NEPALI-LLM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">shivam9980/NEPALI-LLM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/shivam9980__NEPALI-LLM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
shivam9980/NEPALI-LLM
5fe146065b53bfd6d8e242cffbe9176bc245551d
6.892789
apache-2.0
0
10
true
true
true
false
false
9.628949
0.041666
4.166611
0.382846
13.125244
0.006798
0.679758
0.261745
1.565996
0.412198
9.991406
0.206449
11.827719
false
2024-09-17
2024-09-24
1
unsloth/gemma-2-9b-bnb-4bit
shivam9980_mistral-7b-news-cnn-merged_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/shivam9980/mistral-7b-news-cnn-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">shivam9980/mistral-7b-news-cnn-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/shivam9980__mistral-7b-news-cnn-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
shivam9980/mistral-7b-news-cnn-merged
a0d7029cb00c122843aef3d7ad61d514de334ea3
17.120747
apache-2.0
0
7
true
true
true
false
true
1.594093
0.463419
46.341928
0.363548
11.146536
0.01435
1.435045
0.308725
7.829978
0.45226
15.665885
0.282746
20.305112
false
2024-03-18
2024-09-12
1
unsloth/mistral-7b-instruct-v0.2-bnb-4bit
shyamieee_Padma-v7.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/shyamieee/Padma-v7.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">shyamieee/Padma-v7.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/shyamieee__Padma-v7.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
shyamieee/Padma-v7.0
caf70bd6e2f819cc6a18dda8516f2cbdc101fdde
19.756218
apache-2.0
0
7
true
false
true
false
false
0.589899
0.38411
38.410972
0.511879
31.657521
0.070242
7.024169
0.286074
4.809843
0.438552
14.085677
0.302942
22.549128
false
2024-06-26
2024-06-26
1
shyamieee/Padma-v7.0 (Merge)
silma-ai_SILMA-9B-Instruct-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/silma-ai/SILMA-9B-Instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">silma-ai/SILMA-9B-Instruct-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/silma-ai__SILMA-9B-Instruct-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
silma-ai/SILMA-9B-Instruct-v1.0
25d7b116ab3fb9f97417a297f8df4a7e34e7de68
24.369442
gemma
45
9
true
true
true
false
true
1.245999
0.584194
58.419438
0.521902
30.713003
0
0
0.305369
7.38255
0.463698
17.26224
0.391955
32.439421
false
2024-08-17
2024-11-12
0
silma-ai/SILMA-9B-Instruct-v1.0
skymizer_Llama2-7b-sft-chat-custom-template-dpo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/skymizer/Llama2-7b-sft-chat-custom-template-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">skymizer/Llama2-7b-sft-chat-custom-template-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/skymizer__Llama2-7b-sft-chat-custom-template-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
skymizer/Llama2-7b-sft-chat-custom-template-dpo
22302ebd8c551a5f302fcb8366cc61fdeedf0e00
10.090196
llama2
0
6
true
true
true
false
false
0.61647
0.235282
23.528238
0.368847
11.238865
0.011329
1.132931
0.239094
0
0.442865
14.12474
0.194648
10.516401
false
2024-06-11
2024-07-01
1
Removed
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415
467eff1ac1c3395c130929bbe1f34a8194715e7c
8.826874
apache-2.0
0
7
true
true
true
false
true
1.627712
0.289338
28.933785
0.380418
12.789212
0.007553
0.755287
0.246644
0
0.386063
6.024479
0.140126
4.458481
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205
467eff1ac1c3395c130929bbe1f34a8194715e7c
12.818811
apache-2.0
0
7
true
true
true
false
true
1.588998
0.319938
31.993777
0.395862
16.710725
0.001511
0.151057
0.276007
3.467562
0.427177
12.097135
0.212434
12.492612
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522
467eff1ac1c3395c130929bbe1f34a8194715e7c
13.437097
apache-2.0
0
7
true
true
true
false
true
1.614698
0.376441
37.644118
0.382837
14.138282
0.009819
0.981873
0.265101
2.013423
0.440417
14.11875
0.205535
11.726138
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
sonthenguyen_zephyr-sft-bnb-4bit-DPO-mtbc-213steps_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__zephyr-sft-bnb-4bit-DPO-mtbc-213steps-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps
4ae2af48b6ac53f14e153b91309624100ae3d7c2
15.790852
apache-2.0
0
7
true
true
true
false
true
0.69881
0.427549
42.75489
0.419729
19.669907
0.021903
2.190332
0.261745
1.565996
0.408635
9.579427
0.270861
18.98456
false
2024-10-02
2024-10-03
0
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps