chargoddard committed on
Commit
fc26ce0
1 Parent(s): 0aa0b36

Create README.md

---
license: cc-by-nc-4.0
---

# Mixtraln't 4x7B

Oh boy, a new model architecture in Transformers! Time to do profane things with it.

What if, instead of training a MoE from scratch, we took some pre-trained Mistral models and shoved them in a little clown car? Let's find out.
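The clown-car idea amounts to copying each donor model's MLP weights into one expert slot of a Mixtral-style sparse-MoE layer, with the attention weights coming from a single base. Below is a minimal sketch of that key remapping using toy state dicts; the key names follow the usual Mistral/Mixtral naming conventions but are assumptions for illustration, not taken from this repo's actual merge script:

```python
import numpy as np

# Toy per-layer "state dict" standing in for a real torch.load / safetensors load.
# Mistral names its MLP projections gate_proj / up_proj / down_proj.
def donor_state_dict(seed, hidden=8, inter=16):
    rng = np.random.default_rng(seed)
    return {
        "model.layers.0.mlp.gate_proj.weight": rng.normal(size=(inter, hidden)),
        "model.layers.0.mlp.up_proj.weight": rng.normal(size=(inter, hidden)),
        "model.layers.0.mlp.down_proj.weight": rng.normal(size=(hidden, inter)),
    }

# Mixtral calls the same three projections w1 / w3 / w2 inside each expert.
MLP_TO_EXPERT = {"gate_proj": "w1", "up_proj": "w3", "down_proj": "w2"}

def build_moe_state_dict(donors):
    """Map each donor's per-layer MLP weights into its own expert slot."""
    moe = {}
    for expert_idx, sd in enumerate(donors):
        for key, weight in sd.items():
            prefix, proj, suffix = key.rsplit(".", 2)  # ...mlp / gate_proj / weight
            layer_prefix = prefix.removesuffix(".mlp")
            new_key = (f"{layer_prefix}.block_sparse_moe.experts."
                       f"{expert_idx}.{MLP_TO_EXPERT[proj]}.{suffix}")
            moe[new_key] = weight
    return moe

donors = [donor_state_dict(seed) for seed in range(4)]
moe_sd = build_moe_state_dict(donors)
print(len(moe_sd))  # 4 experts x 3 projections = 12 tensors
```

The part this sketch deliberately leaves out is the interesting one: the router (`block_sparse_moe.gate`) has no donor to copy from, which is where the gate-population hack discussed below comes in.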

Uses parts from the following models:

* [Q-bert/MetaMath-Cybertron-Starling](https://huggingface.co/Q-bert/MetaMath-Cybertron-Starling)
* [NeverSleep/Noromaid-7b-v0.1.1](https://huggingface.co/NeverSleep/Noromaid-7b-v0.1.1)
* [teknium/Mistral-Trismegistus-7B](https://huggingface.co/teknium/Mistral-Trismegistus-7B)
* [meta-math/MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B)
* [PocketDoc/Dans-AdventurousWinds-Mk2-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b)

Works and generates coherent text. The big question is whether the hack I used to populate the MoE gates works well enough to take advantage of all of the experts. Let's find out!
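For context on what those gates have to get right: a Mixtral-style MoE layer scores every token against every expert, keeps the top 2, and mixes those experts' outputs with softmax weights. Here is a toy top-2 router in plain NumPy (an illustration of the mechanism, not this repo's code):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def route_top2(hidden, gate_weight):
    """Pick the 2 best experts for one token and their mixing weights."""
    logits = gate_weight @ hidden          # one score per expert
    top2 = np.argsort(logits)[-2:][::-1]   # indices of the 2 highest scores
    weights = softmax(logits[top2])        # renormalize over the chosen pair
    return top2, weights

rng = np.random.default_rng(0)
gate = rng.normal(size=(4, 8))  # 4 experts, hidden size 8
token = rng.normal(size=8)
experts, weights = route_top2(token, gate)
print(experts, weights)  # two expert indices; weights sum to 1
```

The failure mode being tested: if the gate weights are populated badly, the same pair of experts wins for nearly every token, and the remaining experts' weights never get used.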

Prompt format: maybe Alpaca? Or ChatML? Life is full of mysteries.
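For reference, here are the two candidate templates in their standard forms; these are the community-conventional Alpaca and ChatML layouts, not formats confirmed for this model:

```python
# Standard Alpaca instruction template.
alpaca = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

# Standard ChatML template.
chatml = (
    "<|im_start|>system\n{system}<|im_end|>\n"
    "<|im_start|>user\n{instruction}<|im_end|>\n"
    "<|im_start|>assistant\n"
)

print(alpaca.format(instruction="Say hi."))
print(chatml.format(system="You are helpful.", instruction="Say hi."))
```

Trying both and keeping whichever produces the more coherent output is a reasonable way to resolve the mystery.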