Kquant03 committed on
Commit f419869
1 Parent(s): 4f22e8f

Update README.md

Files changed (1)
  1. README.md +6 -1
README.md CHANGED
@@ -23,7 +23,12 @@ The config looks like this...(detailed version is in the files and versions):
  - [macadeliccc/WestLake-7B-v2-laser-truthy-dpo](https://huggingface.co/macadeliccc/WestLake-7B-v2-laser-truthy-dpo) - expert #3
  - [Kukedlc/Triunvirato-7b](https://huggingface.co/Kukedlc/Triunvirato-7b) - expert #4
 
- # Will upload to the huggingface leaderboard
+ # Huge improvement upon the base Buttercup model!!!!
+
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/ZjNod8J9bmnhL9mM4znQv.png)
+
+ # Laser version is rank 2 in the world for roleplay.
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/L1AwFoaVbN-bO3CkuqW5Z.png)
 
 
  # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"