eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Not_Merged | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Maintainer's Highlight | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
vicgalle_ConfigurableBeagle-11B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/ConfigurableBeagle-11B | bbc16dbf94b8e8a99bb3e2ada6755faf9c2990dd | 22.635544 | apache-2.0 | 3 | 10 | true | true | true | false | true | 0.879857 | 0.583445 | 58.344526 | 0.528659 | 32.392023 | 0.043807 | 4.380665 | 0.302013 | 6.935123 | 0.395302 | 7.379427 | 0.337434 | 26.381501 | false | 2024-02-17 | 2024-06-26 | 0 | vicgalle/ConfigurableBeagle-11B |
vicgalle_ConfigurableHermes-7B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableHermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableHermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableHermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/ConfigurableHermes-7B | 1333a88eaf6591836b2d9825d1eaec7260f336c9 | 19.536295 | apache-2.0 | 3 | 7 | true | true | true | false | true | 0.617282 | 0.54108 | 54.107989 | 0.457297 | 23.158164 | 0.047583 | 4.758308 | 0.276846 | 3.579418 | 0.405688 | 9.110938 | 0.302527 | 22.502955 | false | 2024-02-17 | 2024-06-26 | 0 | vicgalle/ConfigurableHermes-7B |
vicgalle_ConfigurableSOLAR-10.7B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableSOLAR-10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableSOLAR-10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableSOLAR-10.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/ConfigurableSOLAR-10.7B | 9d9baad88ea9dbaa61881f15e4f0d16e931033b4 | 19.045696 | apache-2.0 | 2 | 10 | true | true | true | false | true | 0.677681 | 0.509956 | 50.995581 | 0.486681 | 27.45095 | 0 | 0 | 0.298658 | 6.487696 | 0.380479 | 5.193229 | 0.31732 | 24.14672 | false | 2024-03-10 | 2024-06-26 | 0 | vicgalle/ConfigurableSOLAR-10.7B |
vicgalle_Humanish-RP-Llama-3.1-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Humanish-RP-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Humanish-RP-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Humanish-RP-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Humanish-RP-Llama-3.1-8B | d27aa731db1d390a8d17b0a4565c9231ee5ae8b9 | 25.347671 | apache-2.0 | 6 | 8 | true | true | true | false | true | 0.753451 | 0.666926 | 66.692598 | 0.510039 | 29.95856 | 0.147281 | 14.728097 | 0.286913 | 4.9217 | 0.395208 | 8.267708 | 0.347656 | 27.517361 | false | 2024-08-03 | 2024-08-03 | 0 | vicgalle/Humanish-RP-Llama-3.1-8B |
vicgalle_Merge-Mistral-Prometheus-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Merge-Mistral-Prometheus-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Merge-Mistral-Prometheus-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Merge-Mistral-Prometheus-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Merge-Mistral-Prometheus-7B | a7083581b508ce83c74f9267f07024bd462e7161 | 16.574054 | apache-2.0 | 1 | 7 | true | false | true | false | true | 0.630356 | 0.484801 | 48.480144 | 0.42014 | 18.410406 | 0.017372 | 1.73716 | 0.263423 | 1.789709 | 0.41 | 9.95 | 0.271692 | 19.076906 | false | 2024-05-04 | 2024-06-26 | 1 | vicgalle/Merge-Mistral-Prometheus-7B (Merge) |
vicgalle_Merge-Mixtral-Prometheus-8x7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Merge-Mixtral-Prometheus-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Merge-Mixtral-Prometheus-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Merge-Mixtral-Prometheus-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Merge-Mixtral-Prometheus-8x7B | ba53ee5b52a81e56b01e919c069a0d045cfd4e83 | 24.794158 | apache-2.0 | 2 | 46 | true | false | false | false | true | 3.674009 | 0.574403 | 57.440259 | 0.53515 | 34.651421 | 0.094411 | 9.441088 | 0.308725 | 7.829978 | 0.40975 | 9.585417 | 0.368351 | 29.816785 | false | 2024-05-04 | 2024-06-26 | 1 | vicgalle/Merge-Mixtral-Prometheus-8x7B (Merge) |
vicgalle_Roleplay-Llama-3-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Roleplay-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Roleplay-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Roleplay-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Roleplay-Llama-3-8B | 57297eb57dcc2c116f061d9dda341094203da01b | 24.083124 | apache-2.0 | 36 | 8 | true | true | true | false | true | 1.126159 | 0.732022 | 73.202215 | 0.501232 | 28.554604 | 0.095166 | 9.516616 | 0.260906 | 1.454139 | 0.352885 | 1.677344 | 0.370844 | 30.093824 | false | 2024-04-19 | 2024-06-26 | 0 | vicgalle/Roleplay-Llama-3-8B |
vihangd_smart-dan-sft-v0.1_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vihangd/smart-dan-sft-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vihangd/smart-dan-sft-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vihangd__smart-dan-sft-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vihangd/smart-dan-sft-v0.1 | 924b4a09153d4061fa9d58f03b10cd7cde7e3084 | 3.783096 | apache-2.0 | 0 | 0 | true | true | true | false | false | 0.361025 | 0.157646 | 15.764616 | 0.306177 | 3.125599 | 0.004532 | 0.453172 | 0.255034 | 0.671141 | 0.350188 | 1.106771 | 0.114195 | 1.577275 | false | 2024-08-09 | 2024-08-20 | 0 | vihangd/smart-dan-sft-v0.1 |
vonjack_MobileLLM-125M-HF_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vonjack/MobileLLM-125M-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/MobileLLM-125M-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__MobileLLM-125M-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vonjack/MobileLLM-125M-HF | 7664f5e1b91faa04fac545f64db84c26316c7e63 | 5.464647 | cc-by-nc-4.0 | 0 | 0 | true | true | true | false | false | 0.171811 | 0.210728 | 21.072754 | 0.30273 | 3.146584 | 0.003021 | 0.302115 | 0.260067 | 1.342282 | 0.378187 | 5.106771 | 0.116356 | 1.817376 | false | 2024-11-15 | 2024-11-15 | 0 | vonjack/MobileLLM-125M-HF |
vonjack_Phi-3.5-mini-instruct-hermes-fc-json_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/vonjack/Phi-3.5-mini-instruct-hermes-fc-json" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/Phi-3.5-mini-instruct-hermes-fc-json</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Phi-3.5-mini-instruct-hermes-fc-json-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vonjack/Phi-3.5-mini-instruct-hermes-fc-json | 4cacfb35723647d408f0414886d0dfe67404a14f | 4.516525 | apache-2.0 | 1 | 4 | true | true | true | false | true | 1.285189 | 0.141584 | 14.158433 | 0.297476 | 2.390836 | 0 | 0 | 0.254195 | 0.559284 | 0.404135 | 8.45026 | 0.113863 | 1.540337 | false | 2024-11-05 | 2024-11-05 | 1 | vonjack/Phi-3.5-mini-instruct-hermes-fc-json (Merge) |
vonjack_SmolLM2-135M-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vonjack/SmolLM2-135M-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/SmolLM2-135M-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-135M-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vonjack/SmolLM2-135M-Merged | a1700ca913a87ad713edfe57a2030a9d7c088970 | 5.73396 | 0 | 0 | false | true | true | false | true | 0.34551 | 0.248297 | 24.829674 | 0.309993 | 4.587041 | 0.003021 | 0.302115 | 0.238255 | 0 | 0.366187 | 3.440104 | 0.111203 | 1.244829 | false | 2024-11-15 | 2024-11-15 | 1 | vonjack/SmolLM2-135M-Merged (Merge) |
vonjack_SmolLM2-360M-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vonjack/SmolLM2-360M-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/SmolLM2-360M-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-360M-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vonjack/SmolLM2-360M-Merged | 32bceedf56b29a4a9fdd459a36fbc7fae5e274c8 | 7.130731 | 0 | 0 | false | true | true | false | true | 0.385742 | 0.320587 | 32.058715 | 0.315485 | 4.741734 | 0.007553 | 0.755287 | 0.255872 | 0.782998 | 0.352729 | 3.357813 | 0.109791 | 1.08784 | false | 2024-11-15 | 2024-11-15 | 1 | vonjack/SmolLM2-360M-Merged (Merge) |
w4r10ck_SOLAR-10.7B-Instruct-v1.0-uncensored_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/w4r10ck__SOLAR-10.7B-Instruct-v1.0-uncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored | baa7b3899e85af4b2f02b01fd93f203872140d27 | 20.577181 | apache-2.0 | 30 | 10 | true | true | true | false | false | 0.801971 | 0.388406 | 38.84061 | 0.530153 | 33.858639 | 0.003021 | 0.302115 | 0.294463 | 5.928412 | 0.463948 | 18.49349 | 0.334358 | 26.03982 | false | 2023-12-14 | 2024-10-11 | 0 | w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored |
wannaphong_KhanomTanLLM-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/wannaphong/KhanomTanLLM-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">wannaphong/KhanomTanLLM-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/wannaphong__KhanomTanLLM-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | wannaphong/KhanomTanLLM-Instruct | 351239c92c0ff3304d1dd98fdf4ac054a8c1acc3 | 4.617874 | apache-2.0 | 1 | 3 | true | true | true | false | true | 0.401731 | 0.162118 | 16.211763 | 0.309312 | 3.944866 | 0.001511 | 0.151057 | 0.263423 | 1.789709 | 0.370062 | 4.291146 | 0.111868 | 1.318706 | false | 2024-08-24 | 2024-08-29 | 0 | wannaphong/KhanomTanLLM-Instruct |
waqasali1707_Beast-Soul-new_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/waqasali1707/Beast-Soul-new" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">waqasali1707/Beast-Soul-new</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/waqasali1707__Beast-Soul-new-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | waqasali1707/Beast-Soul-new | a23d68c4556d91a129de3f8fd8b9e0ff0890f4cc | 22.108388 | 0 | 7 | false | true | true | false | false | 0.636888 | 0.502987 | 50.298652 | 0.522495 | 33.044262 | 0.070242 | 7.024169 | 0.282718 | 4.362416 | 0.448563 | 14.503646 | 0.310755 | 23.417184 | false | 2024-08-07 | 2024-08-07 | 1 | waqasali1707/Beast-Soul-new (Merge) |
wave-on-discord_qwent-7b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/wave-on-discord/qwent-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">wave-on-discord/qwent-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/wave-on-discord__qwent-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | wave-on-discord/qwent-7b | 40000e76d2a4d0ad054aff9fe873c5beb0e4925e | 8.734093 | 0 | 7 | false | true | true | false | false | 1.323496 | 0.201485 | 20.148539 | 0.42281 | 18.066398 | 0 | 0 | 0.265101 | 2.013423 | 0.381656 | 5.473698 | 0.160322 | 6.702497 | false | 2024-09-30 | 2024-09-30 | 1 | wave-on-discord/qwent-7b (Merge) |
win10_Breeze-13B-32k-Instruct-v1_0_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/win10/Breeze-13B-32k-Instruct-v1_0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/Breeze-13B-32k-Instruct-v1_0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__Breeze-13B-32k-Instruct-v1_0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | win10/Breeze-13B-32k-Instruct-v1_0 | 220c957cf5d9c534a4ef75c11a18221c461de40a | 15.411206 | apache-2.0 | 0 | 12 | true | false | true | false | true | 1.448811 | 0.358431 | 35.843118 | 0.461123 | 25.258699 | 0.009819 | 0.981873 | 0.264262 | 1.901566 | 0.420198 | 11.058073 | 0.256815 | 17.423907 | false | 2024-06-26 | 2024-06-26 | 0 | win10/Breeze-13B-32k-Instruct-v1_0 |
win10_Llama-3.2-3B-Instruct-24-9-29_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/win10/Llama-3.2-3B-Instruct-24-9-29" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/Llama-3.2-3B-Instruct-24-9-29</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__Llama-3.2-3B-Instruct-24-9-29-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | win10/Llama-3.2-3B-Instruct-24-9-29 | 4defb10e2415111abb873d695dd40c387c1d6d57 | 23.929169 | llama3.2 | 0 | 3 | true | true | true | false | true | 0.713606 | 0.733221 | 73.322119 | 0.461423 | 24.196426 | 0.166163 | 16.616314 | 0.274329 | 3.243848 | 0.355521 | 1.440104 | 0.322806 | 24.756206 | false | 2024-09-29 | 2024-10-11 | 2 | meta-llama/Llama-3.2-3B-Instruct |
win10_llama3-13.45b-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/win10/llama3-13.45b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/llama3-13.45b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__llama3-13.45b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | win10/llama3-13.45b-Instruct | 94cc0f415e355c6d3d47168a6ff5239ca586904a | 17.277282 | llama3 | 1 | 13 | true | false | true | false | true | 2.136535 | 0.414435 | 41.443481 | 0.486542 | 26.67569 | 0.020393 | 2.039275 | 0.258389 | 1.118568 | 0.38476 | 6.328385 | 0.334525 | 26.058289 | false | 2024-06-09 | 2024-06-26 | 1 | win10/llama3-13.45b-Instruct (Merge) |
winglian_Llama-3-8b-64k-PoSE_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/winglian/Llama-3-8b-64k-PoSE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">winglian/Llama-3-8b-64k-PoSE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/winglian__Llama-3-8b-64k-PoSE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | winglian/Llama-3-8b-64k-PoSE | 5481d9b74a3ec5a95789673e194c8ff86e2bc2bc | 11.004738 | 74 | 8 | false | true | true | false | true | 0.911021 | 0.285691 | 28.569086 | 0.370218 | 13.307317 | 0.033233 | 3.323263 | 0.260906 | 1.454139 | 0.339552 | 3.077344 | 0.246676 | 16.297281 | false | 2024-04-24 | 2024-06-26 | 0 | winglian/Llama-3-8b-64k-PoSE |
winglian_llama-3-8b-256k-PoSE_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/winglian/llama-3-8b-256k-PoSE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">winglian/llama-3-8b-256k-PoSE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/winglian__llama-3-8b-256k-PoSE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | winglian/llama-3-8b-256k-PoSE | 93e7b0b6433c96583ffcef3bc47203e6fdcbbe8b | 6.557715 | 42 | 8 | false | true | true | false | true | 1.050723 | 0.290911 | 29.091145 | 0.315658 | 5.502849 | 0.015106 | 1.510574 | 0.25755 | 1.006711 | 0.331552 | 0.94401 | 0.111619 | 1.291002 | false | 2024-04-26 | 2024-06-26 | 0 | winglian/llama-3-8b-256k-PoSE |
xMaulana_FinMatcha-3B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xMaulana/FinMatcha-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xMaulana/FinMatcha-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xMaulana__FinMatcha-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xMaulana/FinMatcha-3B-Instruct | be2c0c04fc4dc3fb93631e3c663721da92fea8fc | 24.016243 | apache-2.0 | 0 | 3 | true | true | true | false | true | 6.577035 | 0.754828 | 75.48283 | 0.453555 | 23.191023 | 0.135952 | 13.595166 | 0.269295 | 2.572707 | 0.363333 | 5.016667 | 0.318152 | 24.239066 | false | 2024-09-29 | 2024-10-22 | 1 | xMaulana/FinMatcha-3B-Instruct (Merge) |
xinchen9_Llama3.1_8B_Instruct_CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/Llama3.1_8B_Instruct_CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Llama3.1_8B_Instruct_CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_8B_Instruct_CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/Llama3.1_8B_Instruct_CoT | cab1b33ddff08de11c5daea8ae079d126d503d8b | 16.190743 | apache-2.0 | 0 | 8 | true | true | true | false | false | 1.856552 | 0.297357 | 29.735657 | 0.439821 | 21.142866 | 0.05287 | 5.287009 | 0.302013 | 6.935123 | 0.437062 | 13.166146 | 0.287899 | 20.87766 | false | 2024-09-16 | 2024-09-19 | 0 | xinchen9/Llama3.1_8B_Instruct_CoT |
xinchen9_Llama3.1_CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/Llama3.1_CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Llama3.1_CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/Llama3.1_CoT | 3cb467f51a59ff163bb942fcde3ef60573c12b79 | 13.351283 | apache-2.0 | 0 | 8 | true | true | true | false | true | 0.950099 | 0.224616 | 22.461624 | 0.434101 | 19.899124 | 0.015106 | 1.510574 | 0.288591 | 5.145414 | 0.430458 | 11.773958 | 0.273853 | 19.317007 | false | 2024-09-04 | 2024-09-06 | 0 | xinchen9/Llama3.1_CoT |
xinchen9_Llama3.1_CoT_V1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/Llama3.1_CoT_V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Llama3.1_CoT_V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT_V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/Llama3.1_CoT_V1 | c5ed4b8bfc364ebae1843af14799818551f5251f | 14.394947 | apache-2.0 | 0 | 8 | true | true | true | false | false | 1.873462 | 0.245299 | 24.529914 | 0.4376 | 20.166003 | 0.01284 | 1.283988 | 0.279362 | 3.914989 | 0.457219 | 16.41901 | 0.280502 | 20.055777 | false | 2024-09-06 | 2024-09-07 | 0 | xinchen9/Llama3.1_CoT_V1 |
xinchen9_Mistral-7B-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/Mistral-7B-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Mistral-7B-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Mistral-7B-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/Mistral-7B-CoT | 9a3c8103dac20d5497d1b8fc041bb5125ff4dc00 | 11.202955 | apache-2.0 | 0 | 7 | true | true | true | false | false | 1.888689 | 0.279871 | 27.987074 | 0.387268 | 14.806193 | 0.019637 | 1.963746 | 0.249161 | 0 | 0.399427 | 8.195052 | 0.228391 | 14.265662 | false | 2024-09-09 | 2024-09-23 | 0 | xinchen9/Mistral-7B-CoT |
xinchen9_llama3-b8-ft-dis_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/llama3-b8-ft-dis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/llama3-b8-ft-dis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__llama3-b8-ft-dis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/llama3-b8-ft-dis | e4da730f28f79543262de37908943c35f8df81fe | 13.897963 | apache-2.0 | 0 | 8 | true | true | true | false | false | 1.062327 | 0.154599 | 15.459869 | 0.462579 | 24.727457 | 0.034743 | 3.47432 | 0.312919 | 8.389262 | 0.365375 | 6.405208 | 0.324385 | 24.931664 | false | 2024-06-28 | 2024-07-11 | 0 | xinchen9/llama3-b8-ft-dis |
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table | c083d6796f54f66b4cec2261657a02801c761093 | 22.421029 | 0 | 8 | false | true | true | false | true | 0.624231 | 0.637475 | 63.747523 | 0.491227 | 27.422821 | 0.067976 | 6.797583 | 0.259228 | 1.230425 | 0.382 | 5.483333 | 0.3686 | 29.844489 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table |
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table | 5416d34b5243559914a377ee9d95ce4830bf8dba | 24.502405 | 0 | 8 | false | true | true | false | true | 0.750264 | 0.727451 | 72.745094 | 0.505686 | 29.398353 | 0.084592 | 8.459215 | 0.260067 | 1.342282 | 0.381906 | 5.104948 | 0.369681 | 29.964539 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table |
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table | 235204157d7fac0d64fa609d5aee3cebb49ccd11 | 22.236354 | 0 | 8 | false | true | true | false | true | 0.671741 | 0.656859 | 65.685936 | 0.495183 | 27.6952 | 0.064955 | 6.495468 | 0.259228 | 1.230425 | 0.359396 | 2.291146 | 0.37018 | 30.019947 | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table |
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table | 9db00cbbba84453b18956fcc76f264f94a205955 | 22.935265 | 0 | 8 | false | true | true | false | true | 0.719228 | 0.66208 | 66.207995 | 0.500449 | 28.508587 | 0.077795 | 7.779456 | 0.259228 | 1.230425 | 0.380542 | 5.001042 | 0.359957 | 28.884087 | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table |
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 | 1062757826de031a4ae82277e6e737e19e82e514 | 21.845481 | 0 | 8 | false | true | true | false | true | 0.615003 | 0.604228 | 60.422789 | 0.493606 | 27.613714 | 0.064955 | 6.495468 | 0.259228 | 1.230425 | 0.379333 | 5.216667 | 0.370844 | 30.093824 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 |
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 | e5d2f179b4a7bd851dcf2b7db6358b13001bf1af | 23.938825 | 0 | 8 | false | true | true | false | true | 0.841468 | 0.713188 | 71.318768 | 0.499638 | 28.574879 | 0.069486 | 6.94864 | 0.258389 | 1.118568 | 0.387208 | 6.067708 | 0.366439 | 29.604388 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 |
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 | 0e319ad47ed2b2636b72d07ee9b32657e1e50412 | 21.224624 | 0 | 8 | false | true | true | false | true | 0.679841 | 0.594711 | 59.471092 | 0.489922 | 26.943904 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.358094 | 2.328385 | 0.370429 | 30.047651 | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 |
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 | 0877f2458ea667edcf9213383df41294c788190f | 22.69358 | 0 | 8 | false | true | true | false | true | 0.769119 | 0.645319 | 64.531887 | 0.495108 | 28.046978 | 0.067976 | 6.797583 | 0.260067 | 1.342282 | 0.393875 | 7.334375 | 0.352975 | 28.108378 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 |
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table | d2b87100e5ba3215fddbd308bb17b7bf12fe6c9e | 21.01778 | 0 | 8 | false | true | true | false | true | 0.98643 | 0.575602 | 57.560163 | 0.490121 | 26.866404 | 0.079305 | 7.930514 | 0.259228 | 1.230425 | 0.365969 | 2.979427 | 0.365858 | 29.539746 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table |
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table | 19a48ccf5ea463afbbbc61d650b8fb63ff2d94c7 | 23.969226 | 0 | 8 | false | true | true | false | true | 0.590153 | 0.703446 | 70.344575 | 0.509187 | 29.731239 | 0.086858 | 8.685801 | 0.259228 | 1.230425 | 0.373906 | 3.904948 | 0.369265 | 29.918366 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table |
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table | 0fe230b3432fb2b0f89942d7926291a4dbeb2820 | 21.781466 | 0 | 8 | false | true | true | false | true | 0.665521 | 0.602379 | 60.237946 | 0.496953 | 27.892403 | 0.086103 | 8.610272 | 0.259228 | 1.230425 | 0.367365 | 3.18724 | 0.365775 | 29.530511 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table |
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table | d1e19da1029f2d4d45de015754bc52dcb1ea5570 | 23.059714 | 0 | 8 | false | true | true | false | true | 0.588419 | 0.66203 | 66.203008 | 0.499994 | 28.439824 | 0.083082 | 8.308157 | 0.259228 | 1.230425 | 0.381812 | 5.126562 | 0.361453 | 29.05031 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table |
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 | a478aa202c59773eba615ae37feb4cc750757695 | 20.364052 | 0 | 8 | false | true | true | false | true | 0.586443 | 0.533636 | 53.363631 | 0.491487 | 27.145374 | 0.06571 | 6.570997 | 0.259228 | 1.230425 | 0.377969 | 4.71276 | 0.36245 | 29.161126 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 |
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 | 8ef9ef7e2bf522e707a7b090af55f2ec1eafd4b9 | 23.261322 | 0 | 8 | false | true | true | false | true | 0.869474 | 0.685161 | 68.516093 | 0.507516 | 29.74055 | 0.054381 | 5.438066 | 0.258389 | 1.118568 | 0.383177 | 5.630469 | 0.362118 | 29.124187 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 |
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 | 86673872245ad902f8d466bdc20edae9c115b965 | 20.032169 | 0 | 8 | false | true | true | false | true | 0.675094 | 0.548224 | 54.822427 | 0.488717 | 26.839803 | 0.044562 | 4.456193 | 0.260906 | 1.454139 | 0.363271 | 2.942187 | 0.367104 | 29.678265 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 |
xukp20_llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table | abb3afe2b0398b24ed823b0124c8a72d354487bd | 23.498955 | 0 | 8 | false | true | true | false | true | 1.379342 | 0.690931 | 69.093117 | 0.497846 | 28.119887 | 0.0929 | 9.29003 | 0.259228 | 1.230425 | 0.367333 | 3.083333 | 0.371592 | 30.176936 | false | 2024-09-22 | 2024-09-23 | 0 | xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table |
xxx777xxxASD_L3.1-ClaudeMaid-4x8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/xxx777xxxASD/L3.1-ClaudeMaid-4x8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xxx777xxxASD/L3.1-ClaudeMaid-4x8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xxx777xxxASD__L3.1-ClaudeMaid-4x8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xxx777xxxASD/L3.1-ClaudeMaid-4x8B | 2a98d9cb91c7aa775acbf5bfe7bb91beb2faf682 | 26.190883 | llama3.1 | 7 | 24 | true | true | false | false | true | 2.376185 | 0.669649 | 66.964875 | 0.507085 | 29.437348 | 0.128399 | 12.839879 | 0.291107 | 5.480984 | 0.428937 | 13.750521 | 0.358045 | 28.67169 | false | 2024-07-27 | 2024-07-28 | 0 | xxx777xxxASD/L3.1-ClaudeMaid-4x8B |
yam-peleg_Hebrew-Gemma-11B-Instruct_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | GemmaForCausalLM | <a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Gemma-11B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Gemma-11B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Gemma-11B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yam-peleg/Hebrew-Gemma-11B-Instruct | a40259d1efbcac4829ed44d3b589716f615ed362 | 13.919763 | other | 22 | 10 | true | true | true | false | true | 1.937267 | 0.302077 | 30.207738 | 0.403578 | 16.862741 | 0.057402 | 5.740181 | 0.276007 | 3.467562 | 0.408854 | 9.973438 | 0.255402 | 17.266918 | false | 2024-03-06 | 2024-07-31 | 0 | yam-peleg/Hebrew-Gemma-11B-Instruct |
yam-peleg_Hebrew-Mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yam-peleg/Hebrew-Mistral-7B | 3d32134b5959492fd7efbbf16395352594bc89f7 | 13.302117 | apache-2.0 | 61 | 7 | true | true | true | false | false | 1.399281 | 0.232834 | 23.283443 | 0.433404 | 20.17694 | 0.049849 | 4.984894 | 0.279362 | 3.914989 | 0.397656 | 7.673698 | 0.278009 | 19.778738 | false | 2024-04-26 | 2024-07-11 | 0 | yam-peleg/Hebrew-Mistral-7B |
yam-peleg_Hebrew-Mistral-7B-200K_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yam-peleg/Hebrew-Mistral-7B-200K | 7b51c7b31e3d9e29ea964c579a45233cfad255fe | 10.644291 | apache-2.0 | 15 | 7 | true | true | true | false | false | 0.735312 | 0.185573 | 18.557317 | 0.414927 | 17.493603 | 0.023414 | 2.34139 | 0.276007 | 3.467562 | 0.376479 | 4.526563 | 0.257314 | 17.479314 | false | 2024-05-05 | 2024-07-11 | 0 | yam-peleg/Hebrew-Mistral-7B-200K |
yam-peleg_Hebrew-Mistral-7B-200K_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yam-peleg/Hebrew-Mistral-7B-200K | 7b51c7b31e3d9e29ea964c579a45233cfad255fe | 8.235612 | apache-2.0 | 15 | 7 | true | true | true | false | true | 1.684494 | 0.17698 | 17.698041 | 0.34105 | 7.671324 | 0.021903 | 2.190332 | 0.253356 | 0.447427 | 0.374 | 4.416667 | 0.252909 | 16.989879 | false | 2024-05-05 | 2024-08-06 | 0 | yam-peleg/Hebrew-Mistral-7B-200K |
ycros_BagelMIsteryTour-v2-8x7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ycros/BagelMIsteryTour-v2-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ycros__BagelMIsteryTour-v2-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ycros/BagelMIsteryTour-v2-8x7B | 98a8b319707be3dab1659594da69a37ed8f8c148 | 24.258614 | cc-by-nc-4.0 | 16 | 46 | true | false | true | false | true | 3.649132 | 0.599432 | 59.943173 | 0.515924 | 31.699287 | 0.07855 | 7.854985 | 0.30453 | 7.270694 | 0.420292 | 11.303125 | 0.347324 | 27.480423 | false | 2024-01-19 | 2024-06-28 | 1 | ycros/BagelMIsteryTour-v2-8x7B (Merge) |
ycros_BagelMIsteryTour-v2-8x7B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ycros/BagelMIsteryTour-v2-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ycros__BagelMIsteryTour-v2-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ycros/BagelMIsteryTour-v2-8x7B | 98a8b319707be3dab1659594da69a37ed8f8c148 | 24.724802 | cc-by-nc-4.0 | 16 | 46 | true | false | true | false | true | 3.619337 | 0.62621 | 62.620957 | 0.514194 | 31.366123 | 0.087613 | 8.761329 | 0.307886 | 7.718121 | 0.41375 | 10.31875 | 0.348072 | 27.563534 | false | 2024-01-19 | 2024-08-04 | 1 | ycros/BagelMIsteryTour-v2-8x7B (Merge) |
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table | 97b2d0e790a6fcdf39c34a2043f0818368c7dcb3 | 22.974571 | 0 | 8 | false | true | true | false | true | 0.618253 | 0.670898 | 67.089766 | 0.498661 | 28.170107 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.372698 | 3.853906 | 0.371592 | 30.176936 | false | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table |
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table | e8786291c206d5cd1b01d29466e3b397278f4e2b | 24.877776 | 0 | 8 | false | true | true | false | true | 0.640663 | 0.733271 | 73.327105 | 0.508036 | 29.308128 | 0.097432 | 9.743202 | 0.260067 | 1.342282 | 0.380604 | 5.008854 | 0.374834 | 30.537086 | false | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table |
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table | 0d9cb29aa87b0c17ed011ffbc83803f3f6dd18e7 | 23.168114 | 0 | 8 | false | true | true | false | true | 0.679554 | 0.678466 | 67.846647 | 0.494121 | 27.469588 | 0.095166 | 9.516616 | 0.259228 | 1.230425 | 0.364667 | 2.75 | 0.371759 | 30.195405 | false | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table |
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table | 7a326a956e6169b287a04ef93cdc0342a0f3311a | 24.001677 | 0 | 8 | false | true | true | false | true | 0.648184 | 0.713188 | 71.318768 | 0.502536 | 28.604424 | 0.093656 | 9.365559 | 0.259228 | 1.230425 | 0.371333 | 3.683333 | 0.368268 | 29.80755 | false | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table |
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001 | e5c8baadbf6ce17b344596ad42bd3546f66e253e | 22.364867 | 0 | 8 | false | true | true | false | true | 0.582235 | 0.649565 | 64.956538 | 0.497946 | 28.099199 | 0.048338 | 4.833837 | 0.259228 | 1.230425 | 0.377969 | 4.846094 | 0.372008 | 30.223109 | false | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001 |
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002 | 064e237b850151938caf171a4c8c7e34c93e580e | 24.319539 | 0 | 8 | false | true | true | false | true | 0.606022 | 0.719607 | 71.960731 | 0.504515 | 28.785911 | 0.07855 | 7.854985 | 0.260067 | 1.342282 | 0.383146 | 5.593229 | 0.373421 | 30.380098 | false | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002 |
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001 | b685b90063258e05f8b4930fdbce2e565f13f620 | 22.384837 | 0 | 8 | false | true | true | false | true | 0.649092 | 0.65044 | 65.043972 | 0.495788 | 27.825253 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.366031 | 2.853906 | 0.370263 | 30.029181 | false | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001 |
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002 | 5ab3f2cfc96bdda3b5a629ab4a81adf7394ba90a | 23.522522 | 0 | 8 | false | true | true | false | true | 0.60769 | 0.701597 | 70.159732 | 0.499155 | 28.120615 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.377906 | 4.638281 | 0.366938 | 29.659796 | false | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002 |
yifAI_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yifAI__Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002 | 7a046b74179225d6055dd8aa601b5234f817b1e5 | 22.624782 | 0 | 8 | false | true | true | false | true | 0.672016 | 0.648966 | 64.896586 | 0.491452 | 27.281064 | 0.068731 | 6.873112 | 0.261745 | 1.565996 | 0.389875 | 7.134375 | 0.351978 | 27.997562 | false | 2024-09-30 | 0 | Removed |
ylalain_ECE-PRYMMAL-YL-1B-SLERP-V8_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ylalain__ECE-PRYMMAL-YL-1B-SLERP-V8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8 | 2c00dbc74e55d42fbc8b08f474fb9568f820edb9 | 9.604139 | apache-2.0 | 0 | 1 | true | true | true | false | false | 0.548428 | 0.150527 | 15.052727 | 0.397557 | 15.175392 | 0 | 0 | 0.28943 | 5.257271 | 0.387458 | 6.765625 | 0.238364 | 15.373818 | false | 2024-11-13 | 2024-11-13 | 0 | ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8 |
ymcki_gemma-2-2b-ORPO-jpn-it-abliterated-18_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-ORPO-jpn-it-abliterated-18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18 | aed2a9061ffa21beaec0d617a9605e160136aab4 | 14.633781 | gemma | 0 | 2 | true | true | true | false | true | 6.200402 | 0.463095 | 46.309459 | 0.40529 | 16.301992 | 0.003776 | 0.377644 | 0.288591 | 5.145414 | 0.375427 | 4.728385 | 0.234458 | 14.93979 | false | 2024-10-30 | 2024-11-16 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-ORPO-jpn-it-abliterated-18-merge_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-ORPO-jpn-it-abliterated-18-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge | b72be0a7879f0d82cb2024cfc1d02c370ce3efe8 | 15.737663 | gemma | 0 | 2 | true | true | true | false | true | 1.98799 | 0.521821 | 52.182099 | 0.414689 | 17.348337 | 0.008308 | 0.830816 | 0.283557 | 4.474273 | 0.351396 | 3.357813 | 0.246094 | 16.232639 | false | 2024-10-30 | 2024-11-16 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-17_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-17 | e6f82b93dae0b8207aa3252ab4157182e2610787 | 15.002982 | gemma | 1 | 2 | true | true | true | false | true | 1.104509 | 0.508157 | 50.815724 | 0.407627 | 16.234749 | 0 | 0 | 0.271812 | 2.908277 | 0.370062 | 3.891146 | 0.245512 | 16.167996 | false | 2024-10-16 | 2024-10-18 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-17-18-24_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-18-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24 | 38f56fcb99bd64278a1d90dd23aea527036329a0 | 14.019765 | gemma | 0 | 2 | true | true | true | false | true | 0.704859 | 0.505484 | 50.548434 | 0.381236 | 13.114728 | 0 | 0 | 0.28104 | 4.138702 | 0.350156 | 2.069531 | 0.228225 | 14.247193 | false | 2024-11-06 | 2024-11-06 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-17-ORPO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO | 531b2e2043285cb40cd0433f5ad43441f8ac6b6c | 14.516851 | gemma | 1 | 2 | true | true | true | false | true | 9.681597 | 0.474785 | 47.478468 | 0.389798 | 14.389413 | 0.042296 | 4.229607 | 0.274329 | 3.243848 | 0.37676 | 4.528385 | 0.219082 | 13.231383 | false | 2024-10-18 | 2024-10-27 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca | 5503b5e892be463fa4b1d265b8ba9ba4304af012 | 12.001731 | gemma | 2 | 2 | true | true | true | false | true | 1.184666 | 0.306473 | 30.647349 | 0.40716 | 16.922412 | 0.000755 | 0.075529 | 0.269295 | 2.572707 | 0.396917 | 7.914583 | 0.2249 | 13.877807 | false | 2024-10-27 | 2024-10-27 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-18_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-18 | c50b85f9b60b444f85fe230b8d77fcbc7b18ef91 | 15.503245 | gemma | 1 | 2 | true | true | true | false | true | 1.052664 | 0.517525 | 51.752461 | 0.413219 | 17.143415 | 0 | 0 | 0.27349 | 3.131991 | 0.374156 | 4.269531 | 0.250499 | 16.722074 | false | 2024-10-15 | 2024-10-18 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-18-ORPO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-18-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO | b9f41f53827b8a5a600546b41f63023bf84617a3 | 14.943472 | gemma | 0 | 2 | true | true | true | false | true | 1.610377 | 0.474235 | 47.423503 | 0.403894 | 16.538079 | 0.035498 | 3.549849 | 0.261745 | 1.565996 | 0.395333 | 7.416667 | 0.218501 | 13.166741 | false | 2024-10-22 | 2024-10-22 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-24_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-24 | 06c129ba5261ee88e32035c88f90ca11d835175d | 15.604076 | gemma | 0 | 2 | true | true | true | false | true | 0.810442 | 0.497866 | 49.786566 | 0.41096 | 16.77259 | 0 | 0 | 0.277685 | 3.691275 | 0.39149 | 7.002865 | 0.24734 | 16.371158 | false | 2024-10-24 | 2024-10-25 | 3 | google/gemma-2-2b |
yuvraj17_Llama3-8B-SuperNova-Spectrum-Hermes-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-SuperNova-Spectrum-Hermes-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO | 0da9f780f7dd94ed1e10c8d3e082472ff2922177 | 18.075579 | apache-2.0 | 0 | 8 | true | true | true | false | true | 0.97203 | 0.46909 | 46.908979 | 0.439987 | 21.238563 | 0.055891 | 5.589124 | 0.302013 | 6.935123 | 0.401219 | 9.61901 | 0.263464 | 18.162677 | false | 2024-09-24 | 2024-09-30 | 0 | yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO |
yuvraj17_Llama3-8B-SuperNova-Spectrum-dare_ties_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-SuperNova-Spectrum-dare_ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties | 998d15b32900bc230727c8a7984e005f611723e9 | 19.134801 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.914144 | 0.401271 | 40.127085 | 0.461579 | 23.492188 | 0.082326 | 8.232628 | 0.275168 | 3.355705 | 0.421094 | 11.003385 | 0.35738 | 28.597813 | false | 2024-09-22 | 2024-09-23 | 1 | yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties (Merge) |
yuvraj17_Llama3-8B-abliterated-Spectrum-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-abliterated-Spectrum-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-abliterated-Spectrum-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-abliterated-Spectrum-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yuvraj17/Llama3-8B-abliterated-Spectrum-slerp | 28789950975ecf5aac846c3f2c0a5d6841651ee6 | 17.687552 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.82666 | 0.288488 | 28.848788 | 0.497791 | 28.54693 | 0.058157 | 5.81571 | 0.301174 | 6.823266 | 0.399823 | 11.011198 | 0.325715 | 25.079418 | false | 2024-09-22 | 2024-09-23 | 1 | yuvraj17/Llama3-8B-abliterated-Spectrum-slerp (Merge) |
zake7749_gemma-2-2b-it-chinese-kyara-dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zake7749/gemma-2-2b-it-chinese-kyara-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zake7749/gemma-2-2b-it-chinese-kyara-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zake7749__gemma-2-2b-it-chinese-kyara-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zake7749/gemma-2-2b-it-chinese-kyara-dpo | bbc011dae0416c1664a0287f3a7a0f9563deac91 | 19.334585 | gemma | 6 | 2 | true | true | true | false | false | 1.279309 | 0.538208 | 53.820751 | 0.425746 | 19.061804 | 0.066465 | 6.646526 | 0.266779 | 2.237136 | 0.457563 | 16.761979 | 0.257314 | 17.479314 | false | 2024-08-18 | 2024-10-17 | 1 | zake7749/gemma-2-2b-it-chinese-kyara-dpo (Merge) |
zelk12_Gemma-2-TM-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/Gemma-2-TM-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Gemma-2-TM-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Gemma-2-TM-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/Gemma-2-TM-9B | 42366d605e6bdad354a5632547e37d34d300ff7a | 30.151929 | 0 | 10 | false | true | true | false | true | 1.967893 | 0.804462 | 80.446216 | 0.598659 | 42.049491 | 0 | 0 | 0.346477 | 12.863535 | 0.41524 | 11.238281 | 0.408826 | 34.314051 | false | 2024-11-06 | 2024-11-06 | 1 | zelk12/Gemma-2-TM-9B (Merge) |
zelk12_MT-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Gen1-gemma-2-9B | b78f8883614cbbdf182ebb4acf8a8c124bc782ae | 33.041356 | 0 | 10 | false | true | true | false | true | 3.362746 | 0.788625 | 78.862529 | 0.61 | 44.011247 | 0.133686 | 13.36858 | 0.346477 | 12.863535 | 0.421688 | 11.577604 | 0.438082 | 37.564642 | false | 2024-10-23 | 2024-10-23 | 1 | zelk12/MT-Gen1-gemma-2-9B (Merge) |
zelk12_MT-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Gen2-gemma-2-9B | c723f8b9b7334fddd1eb8b6e5230b76fb18139a5 | 33.644495 | 1 | 10 | false | true | true | false | true | 1.989448 | 0.790749 | 79.074855 | 0.610049 | 44.107782 | 0.148792 | 14.879154 | 0.346477 | 12.863535 | 0.432292 | 13.303125 | 0.438747 | 37.63852 | false | 2024-11-10 | 2024-11-10 | 1 | zelk12/MT-Gen2-gemma-2-9B (Merge) |
zelk12_MT-Merge-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge-gemma-2-9B | f4c3b001bc8692bcbbd7005b6f8db048e651aa46 | 33.393208 | 3 | 10 | false | true | true | false | true | 3.219056 | 0.803538 | 80.353795 | 0.611838 | 44.320842 | 0.13142 | 13.141994 | 0.348154 | 13.087248 | 0.425625 | 12.103125 | 0.43617 | 37.352246 | false | 2024-10-22 | 2024-10-22 | 1 | zelk12/MT-Merge-gemma-2-9B (Merge) |
zelk12_MT-Merge1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge1-gemma-2-9B | 71bb4577c877715f3f6646a224b184544639c856 | 33.130536 | 1 | 10 | false | true | true | false | true | 4.036662 | 0.788625 | 78.862529 | 0.61 | 44.058246 | 0.126888 | 12.688822 | 0.35151 | 13.534676 | 0.424385 | 12.148177 | 0.437417 | 37.490765 | false | 2024-11-07 | 2024-11-07 | 1 | zelk12/MT-Merge1-gemma-2-9B (Merge) |
zelk12_MT-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-gemma-2-9B | 24e1f894517b86dd866c1a5999ced4a5924dcd90 | 30.239612 | 2 | 10 | false | true | true | false | true | 3.023399 | 0.796843 | 79.684349 | 0.60636 | 43.324243 | 0.003021 | 0.302115 | 0.345638 | 12.751678 | 0.407115 | 9.55599 | 0.422374 | 35.819297 | false | 2024-10-11 | 2024-10-11 | 1 | zelk12/MT-gemma-2-9B (Merge) |
zelk12_MT1-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Gen1-gemma-2-9B | 939ac6c12059a18fc1117cdb3861f46816eff2fb | 33.232259 | 0 | 10 | false | true | true | false | true | 3.362485 | 0.797443 | 79.744301 | 0.611779 | 44.273282 | 0.122356 | 12.23565 | 0.34396 | 12.527964 | 0.430958 | 13.103125 | 0.437583 | 37.509235 | false | 2024-10-23 | 2024-10-24 | 1 | zelk12/MT1-Gen1-gemma-2-9B (Merge) |
zelk12_MT1-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Gen2-gemma-2-9B | aeaca7dc7d50a425a5d3c38d7c4a7daf1c772ad4 | 33.142398 | 2 | 10 | false | true | true | false | true | 1.995995 | 0.798367 | 79.836722 | 0.609599 | 43.919191 | 0.113293 | 11.329305 | 0.352349 | 13.646532 | 0.428354 | 12.844271 | 0.435505 | 37.278369 | false | 2024-11-11 | 2024-11-11 | 1 | zelk12/MT1-Gen2-gemma-2-9B (Merge) |
zelk12_MT1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-gemma-2-9B | 3a5e77518ca9c3c8ea2edac4c03bc220ee91f3ed | 33.633829 | 1 | 10 | false | true | true | false | true | 3.345719 | 0.79467 | 79.467036 | 0.610875 | 44.161526 | 0.149547 | 14.954683 | 0.345638 | 12.751678 | 0.432229 | 13.161979 | 0.435755 | 37.306073 | false | 2024-10-12 | 2024-10-14 | 1 | zelk12/MT1-gemma-2-9B (Merge) |
zelk12_MT2-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-Gen1-gemma-2-9B | 167abf8eb4ea01fecd42dc32ad68160c51a8685a | 32.460223 | 0 | 10 | false | true | true | false | true | 3.38321 | 0.785578 | 78.557782 | 0.61008 | 44.141103 | 0.101208 | 10.120846 | 0.343121 | 12.416107 | 0.424323 | 12.007031 | 0.437666 | 37.518469 | false | 2024-10-24 | 2024-10-27 | 1 | zelk12/MT2-Gen1-gemma-2-9B (Merge) |
zelk12_MT2-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-Gen2-gemma-2-9B | 24c487499b5833424ffb9932eed838bb254f61b4 | 33.471172 | 3 | 10 | false | true | true | false | true | 2.037441 | 0.7889 | 78.890012 | 0.609292 | 44.044503 | 0.148036 | 14.803625 | 0.346477 | 12.863535 | 0.427021 | 12.577604 | 0.43883 | 37.647754 | false | 2024-11-12 | 2024-11-12 | 1 | zelk12/MT2-Gen2-gemma-2-9B (Merge) |
zelk12_MT2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-gemma-2-9B | d20d7169ce0f53d586504c50b4b7dc470bf8a781 | 33.2825 | 1 | 10 | false | true | true | false | true | 3.19411 | 0.788575 | 78.857542 | 0.611511 | 44.167481 | 0.147281 | 14.728097 | 0.347315 | 12.975391 | 0.421656 | 11.540365 | 0.436835 | 37.426123 | false | 2024-10-14 | 2024-10-15 | 1 | zelk12/MT2-gemma-2-9B (Merge) |
zelk12_MT3-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen1-gemma-2-9B | cd78df9e67e2e710d8d305f5a03a92c01b1b425d | 31.054845 | 1 | 10 | false | true | true | false | true | 3.113666 | 0.783779 | 78.377926 | 0.610676 | 44.119495 | 0.032477 | 3.247734 | 0.346477 | 12.863535 | 0.415115 | 10.75599 | 0.43268 | 36.964391 | false | 2024-10-24 | 2024-10-28 | 1 | zelk12/MT3-Gen1-gemma-2-9B (Merge) |
zelk12_MT3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-gemma-2-9B | d501b6ea59896fac3dc0a623501a5493b3573cde | 32.352524 | 1 | 10 | false | true | true | false | true | 3.136653 | 0.778609 | 77.860854 | 0.613078 | 44.248465 | 0.104985 | 10.498489 | 0.344799 | 12.639821 | 0.424292 | 11.903125 | 0.43268 | 36.964391 | false | 2024-10-15 | 2024-10-16 | 1 | zelk12/MT3-gemma-2-9B (Merge) |
zelk12_MT4-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen1-gemma-2-9B | 6ed2c66246c7f354decfd3579acb534dc4b0b48c | 33.544994 | 0 | 10 | false | true | true | false | true | 2.103561 | 0.7895 | 78.949964 | 0.609383 | 44.009524 | 0.150302 | 15.030211 | 0.34396 | 12.527964 | 0.432229 | 13.095313 | 0.438913 | 37.656989 | false | 2024-10-25 | 2024-10-29 | 1 | zelk12/MT4-Gen1-gemma-2-9B (Merge) |
zelk12_MT4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-gemma-2-9B | 2167ea02baf9145a697a7d828a17c75b86e5e282 | 33.447349 | 0 | 10 | false | true | true | false | true | 3.155259 | 0.776161 | 77.616059 | 0.607314 | 43.553827 | 0.173716 | 17.371601 | 0.338087 | 11.744966 | 0.430927 | 12.999219 | 0.436586 | 37.398419 | false | 2024-10-16 | 2024-10-20 | 1 | zelk12/MT4-gemma-2-9B (Merge) |
zelk12_MT5-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-Gen1-gemma-2-9B | 0291b776e80f38381788cd8f1fb2c3435ad891b5 | 31.897632 | 0 | 10 | false | true | true | false | true | 2.017253 | 0.78313 | 78.312987 | 0.611048 | 44.183335 | 0.068731 | 6.873112 | 0.347315 | 12.975391 | 0.420385 | 11.614844 | 0.436835 | 37.426123 | false | 2024-10-25 | 2024-10-31 | 1 | zelk12/MT5-Gen1-gemma-2-9B (Merge) |
zelk12_MT5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-gemma-2-9B | b627ae7d796b1ae85b59c55e0e043b8d3ae73d83 | 32.595305 | 0 | 10 | false | true | true | false | true | 3.26983 | 0.804787 | 80.478685 | 0.611223 | 44.271257 | 0.095166 | 9.516616 | 0.343121 | 12.416107 | 0.420385 | 11.48151 | 0.436669 | 37.407654 | false | 2024-10-19 | 2024-10-21 | 1 | zelk12/MT5-gemma-2-9B (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 | b4208ddf6c741884c16c77b9433d9ead8f216354 | 30.344893 | 2 | 10 | false | true | true | false | true | 3.443191 | 0.764895 | 76.489492 | 0.607451 | 43.706516 | 0.013595 | 1.359517 | 0.349832 | 13.310962 | 0.413625 | 10.303125 | 0.432098 | 36.899749 | false | 2024-10-03 | 2024-10-03 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | e652c9e07265526851dad994f4640aa265b9ab56 | 33.300246 | 1 | 10 | false | true | true | false | true | 3.194991 | 0.770665 | 77.066517 | 0.607543 | 43.85035 | 0.155589 | 15.558912 | 0.343121 | 12.416107 | 0.43226 | 13.132552 | 0.439993 | 37.777039 | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 | eb0e589291630ba20328db650f74af949d217a97 | 28.421762 | 0 | 10 | false | true | true | false | true | 3.751453 | 0.720806 | 72.080635 | 0.59952 | 42.487153 | 0 | 0 | 0.349832 | 13.310962 | 0.395115 | 7.75599 | 0.414063 | 34.895833 | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 | 76f56b25bf6d8704282f8c77bfda28ca384883bc | 30.113979 | 1 | 10 | false | true | true | false | true | 3.413675 | 0.759999 | 75.999902 | 0.606626 | 43.633588 | 0.012085 | 1.208459 | 0.348154 | 13.087248 | 0.410958 | 9.836458 | 0.432264 | 36.918218 | false | 2024-10-07 | 2024-10-11 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge) |
zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 | 1e3e623e9f0b386bfd967c629dd39c87daef5bed | 31.626376 | 1 | 10 | false | true | true | false | true | 6.461752 | 0.761523 | 76.152276 | 0.609878 | 43.941258 | 0.073263 | 7.326284 | 0.341443 | 12.192394 | 0.431021 | 13.310937 | 0.431516 | 36.835106 | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ifable-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ifable-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 | 8af6620b39c9a36239879b6b2bd88f66e9e9d930 | 32.254423 | 0 | 10 | false | true | true | false | true | 6.542869 | 0.794396 | 79.439554 | 0.60644 | 43.39057 | 0.09139 | 9.138973 | 0.35151 | 13.534676 | 0.420229 | 11.095313 | 0.432347 | 36.927453 | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 | ced039b03be6f65ac0f713efcee76c6534e65639 | 32.448061 | 0 | 10 | false | true | true | false | true | 3.13222 | 0.744537 | 74.453672 | 0.597759 | 42.132683 | 0.180514 | 18.05136 | 0.34396 | 12.527964 | 0.429469 | 12.183594 | 0.418052 | 35.339096 | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge) |
zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/zetasepic/Qwen2.5-72B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zetasepic/Qwen2.5-72B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zetasepic/Qwen2.5-72B-Instruct-abliterated | af94b3c05c9857dbac73afb1cbce00e4833ec9ef | 45.293139 | other | 9 | 72 | true | true | true | false | false | 18.809182 | 0.715261 | 71.526106 | 0.715226 | 59.912976 | 0.46148 | 46.148036 | 0.406879 | 20.917226 | 0.471917 | 19.122917 | 0.587184 | 54.131575 | false | 2024-10-01 | 2024-11-08 | 2 | Qwen/Qwen2.5-72B |
zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zhengr/MixTAO-7Bx2-MoE-v8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zhengr/MixTAO-7Bx2-MoE-v8.1 | 828e963abf2db0f5af9ed0d4034e538fc1cf5f40 | 17.168311 | apache-2.0 | 54 | 12 | true | true | false | false | true | 0.92739 | 0.418781 | 41.878106 | 0.420194 | 19.176907 | 0.066465 | 6.646526 | 0.298658 | 6.487696 | 0.397625 | 8.303125 | 0.284658 | 20.517509 | false | 2024-02-26 | 2024-06-27 | 0 | zhengr/MixTAO-7Bx2-MoE-v8.1 |