Update README.md
README.md CHANGED
@@ -123,7 +123,7 @@ This model is mainly used for large model technology experiments, and increasing
 | --- | --- |
 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1y2XmAGrQvVfbgtimTsCBO3tem735q7HZ?usp=sharing) | MixTAO-7Bx2-MoE-v8.1 |
 |[mixtao-7bx2-moe-v8.1.Q4_K_M.gguf](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1-GGUF/resolve/main/mixtao-7bx2-moe-v8.1.Q4_K_M.gguf) | GGUF of MixTAO-7Bx2-MoE-v8.1 <br> Only Q4_K_M in https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1-GGUF |
-| Demo Space | https://zhengr-
+| Demo Space | https://huggingface.co/spaces/zhengr/MixTAO-7Bx2-MoE-v8.1/ |
 
 # [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
 Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-v8.1)
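
For reference, the Q4_K_M GGUF file linked in the table above can be run locally. The README does not prescribe a runtime, so the sketch below assumes llama-cpp-python and a locally downloaded copy of the file; the model path and generation settings are illustrative, not part of the model card.

```python
# Minimal sketch: load the Q4_K_M GGUF of MixTAO-7Bx2-MoE-v8.1 with llama-cpp-python.
# Assumes the file from the table above was downloaded to the working directory
# and that llama-cpp-python is installed (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="mixtao-7bx2-moe-v8.1.Q4_K_M.gguf",  # local path to the downloaded GGUF
    n_ctx=4096,       # context window; adjust to available memory
    n_gpu_layers=-1,  # offload all layers to GPU if a GPU-enabled build is installed
)

output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Briefly explain what a mixture-of-experts model is."}],
)
print(output["choices"][0]["message"]["content"])
```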