xiaol committed on
Commit
0e173ca
1 Parent(s): 72f81f5

Update README.md

Files changed (1)
  1. README.md +43 -0
README.md CHANGED
@@ -1,3 +1,46 @@
  ---
  license: apache-2.0
  ---
+ # Model URL: https://huggingface.co/TimeMobius/Mobius-RWKV-r5-chat-12B-8k
+ Considering the long context required when training from scratch, we decided to retrain the r5 12B model from an 8k context length.
+ This model exhibits lower diversity than its predecessor, but it excels at instruction following and logical understanding. The two models can also be used together as multi-agents, each handling a different task.
+
+ # Mobius RWKV r5 chat 12B 8k
+ Mobius is an RWKV v5.2 arch chat model, benefiting from [Matrix-Valued States and Dynamic Recurrence](https://arxiv.org/abs/2404.05892).
+
+ ## Introduction
+
+ Mobius is an RWKV v5.2 arch model: a state-based RNN+CNN+Transformer hybrid language model pretrained on a certain amount of data.
+ Compared with the previously released Mobius, the improvements include:
+
+ * Only 24 GB of VRAM needed to run this model locally in fp16;
+ * Significant performance improvement;
+ * Multilingual support;
+ * Stable support for 128k context length;
+ * Base model: [Mobius-mega-12B-128k-base](https://huggingface.co/TimeMobius/Moibus-mega-12B-128k-base)
+
+
+ ## Usage
+ We encourage you to use few-shot prompting with this model. Directly using the `User: xxxx\n\nAssistant: xxx\n\n` format also works well, but a few shots can bring out its full potential (see the sketch after this section).
+
+ Recommended temperature/top-p pairs: 0.7/0.6, 1/0.3, 1.5/0.3, 0.2/0.8.
+
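As a concrete illustration, here is a minimal sketch of few-shot prompting with the `rwkv` pip package (a recent version with RWKV v5 support). The checkpoint file name, the `rwkv_vocab_v20230424` vocab choice, and the example prompt are assumptions for illustration rather than files confirmed by this repo; adjust them to the checkpoint you actually download.

```python
# Minimal few-shot chat sketch with the `rwkv` pip package (pip install rwkv).
# Assumptions: the checkpoint name below (given without the .pth extension, as the
# package expects) and the RWKV "world" vocab rwkv_vocab_v20230424.
import os
os.environ["RWKV_JIT_ON"] = "1"

from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# fp16 on a single GPU needs roughly 24 GB of VRAM (see the requirements section)
model = RWKV(model="Mobius-RWKV-r5-chat-12B-8k", strategy="cuda fp16")
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")

# Few-shot prompt in the User/Assistant format described above
prompt = (
    "User: Translate 'good morning' into French.\n\n"
    "Assistant: Bonjour.\n\n"
    "User: Now translate 'good night'.\n\n"
    "Assistant:"
)

# One of the recommended temperature/top-p pairs (0.7 / 0.6)
args = PIPELINE_ARGS(temperature=0.7, top_p=0.6)

print(pipeline.generate(prompt, token_count=200, args=args))
```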
+ ## More details
+ Mobius 12B 128k is based on the RWKV v5.2 arch, a leading state-based RNN+CNN+Transformer hybrid large language model focused on the open-source community:
+ * 10~100x reduction in training/inference cost;
+ * state-based, selective memory, which makes it good at grokking context;
+ * community support.
+
+ ## Requirements
+ 24 GB of VRAM to run fp16, 12 GB for int8, and 6 GB for nf4 with the Ai00 server. Either of the tools below can serve the model locally (a sketch of querying the Ai00 server follows this list).
+
+ * [RWKV Runner](https://github.com/josStorer/RWKV-Runner)
+ * [Ai00 server](https://github.com/cgisky1980/ai00_rwkv_server)
+
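For reference, here is a minimal sketch of querying a locally running Ai00 server through its OpenAI-compatible HTTP API. The port (65530) and the `/api/oai/chat/completions` route are assumptions based on Ai00's documented defaults, not something confirmed by this repo; check your server's configuration if they differ.

```python
# Minimal sketch of calling a local Ai00 server via its OpenAI-compatible API.
# Assumptions: default port 65530 and the /api/oai/chat/completions route --
# verify both against the Ai00 server documentation/config you are running.
import requests

resp = requests.post(
    "http://localhost:65530/api/oai/chat/completions",
    json={
        "messages": [
            {"role": "user", "content": "Summarize the RWKV architecture in one sentence."}
        ],
        "temperature": 0.7,  # one of the recommended temperature/top-p pairs
        "top_p": 0.6,
        "max_tokens": 200,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```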
+ ## Future plan
+ If you need an HF-format version, let us know.
+
+ [Mobius-Chat-12B-128k](https://huggingface.co/TimeMobius/Mobius-Chat-12B-128k)