Update README.md
README.md CHANGED
@@ -16,11 +16,11 @@ tags:
 
 This is an update to the original Cognitive Fusion. We intend to perform a fine-tune on it in order to increase its performance.
 
-- [
-- [
-- [
-- [
-- [
+- [automerger/YamshadowExperiment28-7B](https://huggingface.co/automerger/YamshadowExperiment28-7B) - base
+- [automerger/YamshadowExperiment28-7B](https://huggingface.co/automerger/YamshadowExperiment28-7B) - expert #1
+- [liminerity/M7-7b](https://huggingface.co/liminerity/M7-7b) - expert #2
+- [automerger/YamshadowExperiment28-7B](https://huggingface.co/automerger/YamshadowExperiment28-7B) - expert #3
+- [nlpguy/T3QM7](https://huggingface.co/nlpguy/T3QM7) - expert #4
 
 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
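The commit only lists the base and expert checkpoints; the merge recipe itself is not part of this change. For orientation, here is a rough sketch of what a mergekit-moe style configuration with this lineup could look like, written as a Python dict and dumped to YAML. The `gate_mode`, `dtype`, and all `positive_prompts` values are placeholders chosen for illustration, not taken from this repository.

```python
# Hypothetical sketch only: the actual merge recipe is not included in this commit.
# It mirrors the base/expert lineup listed in the diff above in a mergekit-moe
# style layout; gate_mode, dtype, and the routing prompts are placeholders.
import yaml

moe_recipe = {
    "base_model": "automerger/YamshadowExperiment28-7B",
    "gate_mode": "hidden",      # assumption: hidden-state gating
    "dtype": "bfloat16",        # assumption
    "experts": [
        {"source_model": "automerger/YamshadowExperiment28-7B",
         "positive_prompts": ["placeholder prompt for expert #1"]},
        {"source_model": "liminerity/M7-7b",
         "positive_prompts": ["placeholder prompt for expert #2"]},
        {"source_model": "automerger/YamshadowExperiment28-7B",
         "positive_prompts": ["placeholder prompt for expert #3"]},
        {"source_model": "nlpguy/T3QM7",
         "positive_prompts": ["placeholder prompt for expert #4"]},
    ],
}

# Write the recipe to disk in the YAML form such tools usually consume.
with open("moe-config.yml", "w") as f:
    yaml.safe_dump(moe_recipe, f, sort_keys=False)
```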
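The heading above links to a general MoE explainer. As a self-contained illustration of the idea (a router sending each token to its top-2 experts out of 4 and mixing their outputs), here is a minimal NumPy sketch. Every dimension, weight, and the routing scheme are illustrative assumptions, not this model's actual implementation.

```python
# Illustrative MoE routing sketch, not this model's code; all values are made up.
import numpy as np

rng = np.random.default_rng(0)
hidden_dim, num_experts, top_k = 16, 4, 2

# Router: a single linear layer that scores each expert for every token.
router_w = rng.normal(size=(hidden_dim, num_experts))
# Each "expert" is just a random linear map standing in for a feed-forward block.
experts = [rng.normal(size=(hidden_dim, hidden_dim)) for _ in range(num_experts)]

def moe_layer(x):
    """Send each token to its top-2 experts and mix their outputs by gate weight."""
    logits = x @ router_w                            # (tokens, num_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]    # indices of the best experts per token
    out = np.zeros_like(x)
    for t, token in enumerate(x):
        scores = logits[t, top[t]]
        gates = np.exp(scores - scores.max())
        gates /= gates.sum()                         # softmax over the selected experts only
        for gate, idx in zip(gates, top[t]):
            out[t] += gate * (token @ experts[idx])
    return out

tokens = rng.normal(size=(3, hidden_dim))            # 3 dummy token vectors
print(moe_layer(tokens).shape)                       # (3, 16)
```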