Kquant03 committed
Commit d4fa544
1 Parent(s): 48778a9

Update README.md

Files changed (1): README.md +3 -2
README.md CHANGED

@@ -18,8 +18,9 @@ I was approached with the idea to make a merge based on story telling, and consi
 
 We believe that this, while it might not be better logically than mixtral base instruct, is definitely more creative. Special thanks to [NeuralNovel](https://huggingface.co/NeuralNovel) for collaborating with me on this project
 
-![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/V4cv6tthy1quRRMCvAf2H.png)
-# The model performed slightly better than base mixtral instruct in erotic roleplay variety on [Ayumi's benchmark](http://ayumi.m8geil.de/ayumi_bench_v3_results.html) and #29/818 BEST FRANKENMOE overall!!!
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/xXXJhZNJ4q3suxJ9LyLqK.png)
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/ZpX1KMNYj11k4pF0NX3Q9.png)
+It performs better than base mixtral 8x across many evaluations. It's half the size and is comparable to most MoEs. Thanks so much to HuggingFace for evaluating it!
 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
 