Kquant03 committed
Commit
77bcd52
1 Parent(s): efc7a03

Update README.md

Files changed (1)
  1. README.md +1 -0
README.md CHANGED
@@ -16,6 +16,7 @@ My first 8x7B frankenMoE...aimed to incorporate everything I've learned, so far.
  - [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) - expert #6
  - [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) - expert #7
  - [SanjiWatsuki/Lelantos-DPO-7B](https://huggingface.co/SanjiWatsuki/Lelantos-DPO-7B) - expert #8
+
 
 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
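The README heading in the hunk above links to the Mixture-of-Experts primer. As a rough illustration of the routing idea behind an 8-expert model like this one, a top-k softmax gate can be sketched as follows; every name here (`moe_forward`, `gate_w`, the toy linear experts) is illustrative and assumed, not taken from this model card or its merge config:

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route input x through the top_k experts chosen by a softmax gate.

    x: (d,) input vector; gate_w: (n_experts, d) gating weights;
    experts: list of callables, each mapping a (d,) vector to a (d,) vector.
    """
    logits = gate_w @ x                        # one gating score per expert
    top = np.argsort(logits)[-top_k:]          # indices of the top_k experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                   # softmax over the selected experts only
    # Output is the gate-weighted sum of the selected experts' outputs
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy setup: 8 experts (matching the 8x7B layout), each a simple linear map
rng = np.random.default_rng(0)
d, n_experts = 4, 8
experts = [(lambda W: (lambda v: W @ v))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]
gate_w = rng.normal(size=(n_experts, d))
y = moe_forward(rng.normal(size=d), gate_w, experts)
```

Only `top_k` of the eight experts run per token, which is what lets an 8x7B MoE keep per-token compute close to a single 7B model.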