---
license: apache-2.0
---
# Mobius Chat 12B 128K
## Introduction
Mobius is an RWKV v5.2 architecture model: a state-based RNN+CNN+Transformer mixed language model pretrained on a large corpus.
Compared with the previously released Mobius, the improvements include:
* Runs locally with only 24 GB of VRAM in fp16;
* Significant performance improvements;
* Multilingual support;
* Stable support for a 128K context length;
* Base model: [Mobius-mega-12B-128k-base](https://huggingface.co/TimeMobius/Moibus-mega-12B-128k-base)
## Usage
We encourage you to use few-shot prompting with this model. Directly using the `User: xxx\n\nAssistant: xxx\n\n` format also works well, but a few-shot prompt can better bring out the model's potential.
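The few-shot format above can be sketched as a simple prompt builder. This is a minimal sketch; the helper name `build_prompt` and the example Q&A pairs are illustrative, not part of the model's tooling:

```python
# Build a few-shot prompt in the User:/Assistant: turn format described above.
# The example question/answer pairs below are placeholders.

def build_prompt(examples, question):
    """Join few-shot (user, assistant) pairs plus the final question
    into the User:/Assistant: format the model expects."""
    parts = []
    for user_msg, assistant_msg in examples:
        parts.append(f"User: {user_msg}\n\nAssistant: {assistant_msg}\n\n")
    # End with an open Assistant: turn for the model to complete.
    parts.append(f"User: {question}\n\nAssistant:")
    return "".join(parts)

few_shot = [
    ("What is the capital of France?", "The capital of France is Paris."),
]
prompt = build_prompt(few_shot, "What is the capital of Japan?")
print(prompt)
```

Stopping generation at the next `\n\nUser:` keeps the model from writing the user's side of the conversation.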
## More details
Mobius 12B 128k is based on the RWKV v5.2 architecture, a leading state-based RNN+CNN+Transformer mixed large language model focused on the open-source community:
* 10~100x reduction in training/inference cost;
* State-based, selective memory, which makes it good at grokking;
* Community support.
## Requirements
24 GB of VRAM to run in fp16, 12 GB for int8, and 6 GB for nf4 with the Ai00 server.
* [RWKV Runner](https://github.com/josStorer/RWKV-Runner)
* [Ai00 server](https://github.com/cgisky1980/ai00_rwkv_server)
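For serving, the Ai00 server exposes an HTTP API. The sketch below only constructs a request payload; the endpoint URL, port, and field names are assumptions, so check the Ai00 server documentation for the exact interface:

```python
# Sketch of a completion request payload for a locally running Ai00 server.
# The URL, port, and payload fields are assumptions, not confirmed API details.
import json

AI00_URL = "http://localhost:65530/api/oai/completions"  # assumed endpoint

payload = {
    "prompt": "User: Hello!\n\nAssistant:",
    "max_tokens": 256,
    "temperature": 1.0,
    "top_p": 0.5,
    "stop": ["\n\nUser:"],  # stop before the model starts the next user turn
}
body = json.dumps(payload)
print(body)

# To actually send it (requires a running Ai00 server):
#   import urllib.request
#   req = urllib.request.Request(
#       AI00_URL, body.encode(), {"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())
```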
## Future plans
If you need an HF-format version, let us know.
[Mobius-Chat-12B-128k](https://huggingface.co/TimeMobius/Mobius-Chat-12B-128k)