eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
zelk12_MT4-Max-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B](https://huggingface.co/zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Max-Merge_02012025163610-gemma-2-9B-details) | zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B | 25e64938f38ed3db0113007a2814b069fd2952b0 | 22.332038 | gemma | 0 | 10.159 | true | false | false | true | 3.9168 | 0.177079 | 17.707904 | 0.612013 | 44.173982 | 0.095166 | 9.516616 | 0.35151 | 13.534676 | 0.422802 | 11.383594 | 0.439079 | 37.675458 | true | false | 2025-01-11 | 2025-01-11 | 1 | zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B (Merge) |
zelk12_MT4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT4-gemma-2-9B](https://huggingface.co/zelk12/MT4-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-gemma-2-9B-details) | zelk12/MT4-gemma-2-9B | 2167ea02baf9145a697a7d828a17c75b86e5e282 | 34.026402 | | 0 | 10.159 | false | false | false | true | 6.310517 | 0.776161 | 77.616059 | 0.607314 | 43.553827 | 0.208459 | 20.845921 | 0.338087 | 11.744966 | 0.430927 | 12.999219 | 0.436586 | 37.398419 | false | false | 2024-10-16 | 2024-10-20 | 1 | zelk12/MT4-gemma-2-9B (Merge) |
zelk12_MT5-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen1-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen1-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen1-gemma-2-9B-details) | zelk12/MT5-Gen1-gemma-2-9B | 0291b776e80f38381788cd8f1fb2c3435ad891b5 | 34.440432 | | 0 | 10.159 | false | false | false | true | 4.034506 | 0.78313 | 78.312987 | 0.611048 | 44.183335 | 0.221299 | 22.129909 | 0.347315 | 12.975391 | 0.420385 | 11.614844 | 0.436835 | 37.426123 | false | false | 2024-10-25 | 2024-10-31 | 1 | zelk12/MT5-Gen1-gemma-2-9B (Merge) |
zelk12_MT5-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen2-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen2-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen2-gemma-2-9B-details) | zelk12/MT5-Gen2-gemma-2-9B | 3ee2822fcba6708bd9032b79249a2789e5996b6a | 34.55155 | | 1 | 10.159 | false | false | false | true | 3.716761 | 0.796244 | 79.624397 | 0.610541 | 44.113215 | 0.220544 | 22.054381 | 0.35151 | 13.534676 | 0.416292 | 10.436458 | 0.437916 | 37.546173 | false | false | 2024-11-23 | 2024-11-23 | 1 | zelk12/MT5-Gen2-gemma-2-9B (Merge) |
zelk12_MT5-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen3-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen3-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen3-gemma-2-9B-details) | zelk12/MT5-Gen3-gemma-2-9B | 4b3811c689fec5c9cc483bb1ed696734e5e88fcf | 34.488645 | | 0 | 10.159 | false | false | false | true | 3.874666 | 0.78253 | 78.253035 | 0.609049 | 43.885913 | 0.216767 | 21.676737 | 0.35151 | 13.534676 | 0.423052 | 12.08151 | 0.4375 | 37.5 | false | false | 2024-12-08 | 2024-12-08 | 1 | zelk12/MT5-Gen3-gemma-2-9B (Merge) |
zelk12_MT5-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen4-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen4-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen4-gemma-2-9B-details) | zelk12/MT5-Gen4-gemma-2-9B | 2f826d76460a5b7f150622a57f2d5419adfc464f | 34.658891 | gemma | 0 | 10.159 | true | false | false | true | 3.643441 | 0.783455 | 78.345457 | 0.613106 | 44.323211 | 0.22432 | 22.432024 | 0.353188 | 13.758389 | 0.422833 | 11.354167 | 0.439661 | 37.7401 | true | false | 2024-12-20 | 2024-12-20 | 1 | zelk12/MT5-Gen4-gemma-2-9B (Merge) |
zelk12_MT5-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen5-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen5-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen5-gemma-2-9B-details) | zelk12/MT5-Gen5-gemma-2-9B | d1f68652d7dda810da8207a371d26376c6a6e847 | 34.634253 | gemma | 2 | 10.159 | true | false | false | true | 3.783291 | 0.79472 | 79.472023 | 0.611166 | 44.115081 | 0.225831 | 22.583082 | 0.348154 | 13.087248 | 0.419115 | 11.55599 | 0.432929 | 36.992095 | true | false | 2024-12-29 | 2024-12-29 | 1 | zelk12/MT5-Gen5-gemma-2-9B (Merge) |
zelk12_MT5-Max-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B](https://huggingface.co/zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Max-Merge_02012025163610-gemma-2-9B-details) | zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B | a90f9ca13af28c72695fabc56da4ddd8e3a8e173 | 22.353757 | gemma | 0 | 10.159 | true | false | false | true | 4.124419 | 0.176155 | 17.615482 | 0.612679 | 44.274407 | 0.098187 | 9.818731 | 0.35151 | 13.534676 | 0.422771 | 11.213021 | 0.438996 | 37.666223 | true | false | 2025-01-14 | 2025-01-14 | 1 | zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B (Merge) |
zelk12_MT5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-gemma-2-9B](https://huggingface.co/zelk12/MT5-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-gemma-2-9B-details) | zelk12/MT5-gemma-2-9B | b627ae7d796b1ae85b59c55e0e043b8d3ae73d83 | 34.773049 | | 0 | 10.159 | false | false | false | true | 6.53966 | 0.804787 | 80.478685 | 0.611223 | 44.271257 | 0.225831 | 22.583082 | 0.343121 | 12.416107 | 0.420385 | 11.48151 | 0.436669 | 37.407654 | false | false | 2024-10-19 | 2024-10-21 | 1 | zelk12/MT5-gemma-2-9B (Merge) |
zelk12_MTM-Merge-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MTM-Merge-gemma-2-9B](https://huggingface.co/zelk12/MTM-Merge-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MTM-Merge-gemma-2-9B-details) | zelk12/MTM-Merge-gemma-2-9B | 843f23c68cf50f5bdc0206f93e72ce0f9feeca6e | 34.614985 | gemma | 2 | 10.159 | true | false | false | true | 3.586692 | 0.779808 | 77.980758 | 0.613335 | 44.380677 | 0.217523 | 21.752266 | 0.354866 | 13.982103 | 0.426771 | 11.946354 | 0.43883 | 37.647754 | true | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/MTM-Merge-gemma-2-9B (Merge) |
zelk12_MTMaMe-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B](https://huggingface.co/zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MTMaMe-Merge_02012025163610-gemma-2-9B-details) | zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B | ce68b2468bcba0c5dcde79bbf5346db81f069b12 | 22.385497 | gemma | 0 | 10.159 | true | false | false | true | 3.795679 | 0.178603 | 17.860277 | 0.611679 | 44.160463 | 0.095921 | 9.592145 | 0.352349 | 13.646532 | 0.424104 | 11.479687 | 0.438165 | 37.573877 | true | false | 2025-01-16 | 2025-01-16 | 1 | zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B (Merge) |
zelk12_Rv0.4DMv1t0.25-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Rv0.4DMv1t0.25-gemma-2-9B](https://huggingface.co/zelk12/Rv0.4DMv1t0.25-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4DMv1t0.25-gemma-2-9B-details) | zelk12/Rv0.4DMv1t0.25-gemma-2-9B | 23e7337dabbf023177c25ded4923286a2e3936fc | 34.114018 | | 0 | 10.159 | false | false | false | true | 3.837289 | 0.749658 | 74.965758 | 0.606971 | 43.664764 | 0.225831 | 22.583082 | 0.345638 | 12.751678 | 0.430927 | 12.932552 | 0.440076 | 37.786274 | false | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/Rv0.4DMv1t0.25-gemma-2-9B (Merge) |
zelk12_Rv0.4DMv1t0.25Tt0.25-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B](https://huggingface.co/zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4DMv1t0.25Tt0.25-gemma-2-9B-details) | zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B | 28fbcc2fa23f46aaaed327984784251527c78815 | 33.877515 | gemma | 0 | 10.159 | true | false | false | true | 3.825166 | 0.76462 | 76.46201 | 0.609786 | 43.914819 | 0.206949 | 20.694864 | 0.342282 | 12.304251 | 0.428292 | 12.703125 | 0.434674 | 37.186022 | true | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B (Merge) |
zelk12_Rv0.4MT4g2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Rv0.4MT4g2-gemma-2-9B](https://huggingface.co/zelk12/Rv0.4MT4g2-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4MT4g2-gemma-2-9B-details) | zelk12/Rv0.4MT4g2-gemma-2-9B | ef595241d2c62203c27d84e6643d384a7cf99bd4 | 33.255962 | gemma | 1 | 10.159 | true | false | false | true | 3.706507 | 0.732022 | 73.202215 | 0.60412 | 43.199046 | 0.194864 | 19.486405 | 0.353188 | 13.758389 | 0.423083 | 11.91875 | 0.441739 | 37.970966 | true | false | 2025-01-04 | 2025-01-04 | 1 | zelk12/Rv0.4MT4g2-gemma-2-9B (Merge) |
zelk12_T31122024203920-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/T31122024203920-gemma-2-9B](https://huggingface.co/zelk12/T31122024203920-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__T31122024203920-gemma-2-9B-details) | zelk12/T31122024203920-gemma-2-9B | 25cb58c73a3adf43cee33b50238b1d332b5ccc13 | 34.209071 | gemma | 0 | 10.159 | true | false | false | true | 3.732738 | 0.767618 | 76.76177 | 0.609563 | 43.728997 | 0.205438 | 20.543807 | 0.350671 | 13.422819 | 0.432198 | 13.32474 | 0.437251 | 37.472296 | true | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/T31122024203920-gemma-2-9B (Merge) |
zelk12_Test01012025155054_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Test01012025155054](https://huggingface.co/zelk12/Test01012025155054) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Test01012025155054-details) | zelk12/Test01012025155054 | c607186b0b079975e3305e0223e0a55f0cbc19e5 | 3.591417 | | 0 | 3.817 | false | false | false | true | 1.400948 | 0.155523 | 15.55229 | 0.28295 | 1.280547 | 0 | 0 | 0.241611 | 0 | 0.367021 | 3.710937 | 0.109043 | 1.004728 | false | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/Test01012025155054 (Merge) |
zelk12_Test01012025155054t0.5_gemma-2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Test01012025155054t0.5_gemma-2](https://huggingface.co/zelk12/Test01012025155054t0.5_gemma-2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Test01012025155054t0.5_gemma-2-details) | zelk12/Test01012025155054t0.5_gemma-2 | 14fcae0d420d303df84bd9b9c8744a6f0fa147fb | 3.591417 | | 0 | 3.817 | false | false | false | true | 1.395928 | 0.155523 | 15.55229 | 0.28295 | 1.280547 | 0 | 0 | 0.241611 | 0 | 0.367021 | 3.710937 | 0.109043 | 1.004728 | false | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/Test01012025155054t0.5_gemma-2 (Merge) |
zelk12_gemma-2-S2MTM-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/gemma-2-S2MTM-9B](https://huggingface.co/zelk12/gemma-2-S2MTM-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__gemma-2-S2MTM-9B-details) | zelk12/gemma-2-S2MTM-9B | fd6860743943114eeca6fc2e800e27c87873bcc5 | 33.89283 | gemma | 0 | 10.159 | true | false | false | true | 3.530205 | 0.782256 | 78.225553 | 0.606084 | 43.115728 | 0.204683 | 20.468278 | 0.345638 | 12.751678 | 0.421844 | 12.163802 | 0.429688 | 36.631944 | true | false | 2024-12-11 | 2024-12-11 | 1 | zelk12/gemma-2-S2MTM-9B (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 | b4208ddf6c741884c16c77b9433d9ead8f216354 | 33.919919 | | 2 | 10.159 | false | false | false | true | 6.886383 | 0.764895 | 76.489492 | 0.607451 | 43.706516 | 0.228097 | 22.809668 | 0.349832 | 13.310962 | 0.413625 | 10.303125 | 0.432098 | 36.899749 | false | false | 2024-10-03 | 2024-10-03 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | e652c9e07265526851dad994f4640aa265b9ab56 | 34.282119 | | 1 | 10.159 | false | false | false | true | 6.389981 | 0.770665 | 77.066517 | 0.607543 | 43.85035 | 0.214502 | 21.450151 | 0.343121 | 12.416107 | 0.43226 | 13.132552 | 0.439993 | 37.777039 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 | eb0e589291630ba20328db650f74af949d217a97 | 31.782789 | | 0 | 10.159 | false | false | false | true | 7.502906 | 0.720806 | 72.080635 | 0.59952 | 42.487153 | 0.201662 | 20.166163 | 0.349832 | 13.310962 | 0.395115 | 7.75599 | 0.414063 | 34.895833 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.2-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 | 76f56b25bf6d8704282f8c77bfda28ca384883bc | 33.626064 | | 1 | 10.159 | false | false | false | true | 6.827351 | 0.759999 | 75.999902 | 0.606626 | 43.633588 | 0.22281 | 22.280967 | 0.348154 | 13.087248 | 0.410958 | 9.836458 | 0.432264 | 36.918218 | false | false | 2024-10-07 | 2024-10-11 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge) |
zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details) | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 | 1e3e623e9f0b386bfd967c629dd39c87daef5bed | 33.904825 | | 1 | 10.159 | false | false | false | true | 9.69897 | 0.761523 | 76.152276 | 0.609878 | 43.941258 | 0.20997 | 20.996979 | 0.341443 | 12.192394 | 0.431021 | 13.310937 | 0.431516 | 36.835106 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ifable-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Ifable-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 | 8af6620b39c9a36239879b6b2bd88f66e9e9d930 | 34.406991 | | 0 | 10.159 | false | false | false | true | 9.808856 | 0.794396 | 79.439554 | 0.60644 | 43.39057 | 0.220544 | 22.054381 | 0.35151 | 13.534676 | 0.420229 | 11.095313 | 0.432347 | 36.927453 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details) | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 | ced039b03be6f65ac0f713efcee76c6534e65639 | 32.586531 | | 1 | 10.159 | false | false | false | true | 6.264441 | 0.744537 | 74.453672 | 0.597759 | 42.132683 | 0.188822 | 18.882175 | 0.34396 | 12.527964 | 0.429469 | 12.183594 | 0.418052 | 35.339096 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge) |
zetasepic_Qwen2.5-32B-Instruct-abliterated-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [zetasepic/Qwen2.5-32B-Instruct-abliterated-v2](https://huggingface.co/zetasepic/Qwen2.5-32B-Instruct-abliterated-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-32B-Instruct-abliterated-v2-details) | zetasepic/Qwen2.5-32B-Instruct-abliterated-v2 | 5894fbf0a900e682dfc0ed794db337093bd8d26b | 46.888673 | apache-2.0 | 14 | 32.764 | true | false | false | true | 13.489578 | 0.833413 | 83.341312 | 0.693402 | 56.533818 | 0.595166 | 59.516616 | 0.36745 | 15.659955 | 0.435427 | 14.928385 | 0.562168 | 51.35195 | false | false | 2024-10-11 | 2024-12-07 | 2 | Qwen/Qwen2.5-32B |
zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [zetasepic/Qwen2.5-72B-Instruct-abliterated](https://huggingface.co/zetasepic/Qwen2.5-72B-Instruct-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details) | zetasepic/Qwen2.5-72B-Instruct-abliterated | af94b3c05c9857dbac73afb1cbce00e4833ec9ef | 46.337953 | other | 28 | 72.706 | true | false | false | false | 37.618363 | 0.715261 | 71.526106 | 0.715226 | 59.912976 | 0.524169 | 52.416918 | 0.406879 | 20.917226 | 0.471917 | 19.122917 | 0.587184 | 54.131575 | false | false | 2024-10-01 | 2024-11-08 | 2 | Qwen/Qwen2.5-72B |
zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details) | zhengr/MixTAO-7Bx2-MoE-v8.1 | 828e963abf2db0f5af9ed0d4034e538fc1cf5f40 | 17.067606 | apache-2.0 | 55 | 12.879 | true | true | false | true | 1.85478 | 0.418781 | 41.878106 | 0.420194 | 19.176907 | 0.060423 | 6.042296 | 0.298658 | 6.487696 | 0.397625 | 8.303125 | 0.284658 | 20.517509 | false | false | 2024-02-26 | 2024-06-27 | 0 | zhengr/MixTAO-7Bx2-MoE-v8.1 |
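
The rows above are a slice of the leaderboard's aggregated results table, so the same view can be rebuilt programmatically. Below is a minimal sketch using the `datasets` library; the repo id `open-llm-leaderboard/contents` and the `train` split are assumptions, so adjust both if the aggregated table is published under a different name. Column names follow the header above.

```python
# Minimal sketch: rebuild a slice like the table above from the aggregated
# leaderboard data. The repo id "open-llm-leaderboard/contents" and the
# "train" split are assumptions; adjust if the dataset lives elsewhere.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

# Keep only merge/moerge submissions, as in most rows shown here.
merges = ds.filter(lambda r: r["Type"] == "🤝 base merges and moerges")

# Sort by the headline score and print the five strongest merges.
top = sorted(merges, key=lambda r: r["Average ⬆️"], reverse=True)[:5]
for row in top:
    print(f'{row["fullname"]}  avg={row["Average ⬆️"]:.2f}  params={row["#Params (B)"]}B')
```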
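Each benchmark appears twice: a Raw fraction in [0, 1] and a display score on a 0–100 scale. The display score is not always raw × 100: judging from the columns, each task is rescaled between its random-guess baseline and a perfect score. A hedged sketch of that rescaling, checked against the first data row:

```python
# Hedged sketch of the rescaling implied by the Raw vs. display columns:
# map a raw score from [baseline, 1] onto [0, 100].
def normalize(raw: float, random_baseline: float) -> float:
    return 100.0 * (raw - random_baseline) / (1.0 - random_baseline)

# Checked against the first data row above:
print(normalize(0.177079, 0.0))   # IFEval, no baseline      -> ~17.71
print(normalize(0.35151, 0.25))   # GPQA, 4-way choice       -> ~13.53
print(normalize(0.439079, 0.1))   # MMLU-PRO, 10-way choice  -> ~37.68
# BBH and MUSR mix subtasks with different baselines, so their display
# scores average per-subtask normalizations and won't reproduce from the
# aggregate Raw column alone.
```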