license: apache-2.0
---

# What is it? A 2x7B MoE model for Roleplay(?).

You may get GPT-like responses sometimes; just skip them and reroll (gacha time). Overall, I think it is good enough for roleplaying.

# You may want to see this: https://huggingface.co/Alsebay/My_LLMs_Leaderboard

# This model is a Mixture of Experts (MoE) made with the following models:
- udkai/Turdus
- Kquant03/Samlagast-7B-laser-bf16
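
Two-expert MoE merges like this are commonly built with mergekit's `mergekit-moe` tool. The card does not state the actual recipe, so the config below is only a hypothetical sketch of what such a merge could look like; the `base_model` choice, `gate_mode`, and the `positive_prompts` routing hints are all assumptions, not the values actually used.

```yaml
# Hypothetical mergekit-moe config -- illustrative only, not the actual recipe.
base_model: udkai/Turdus              # assumed base; the real merge may differ
gate_mode: hidden                     # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: udkai/Turdus
    positive_prompts:                 # example routing prompts (assumed)
      - "chat"
      - "assistant"
  - source_model: Kquant03/Samlagast-7B-laser-bf16
    positive_prompts:
      - "roleplay"
      - "story"
```

With a config like this, `mergekit-moe config.yaml ./output-model` would produce the merged checkpoint, assuming a standard mergekit installation.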