BlinkDL committed
Commit 1067d68
1 Parent(s): 92d6286

Update README.md

Files changed (1): README.md (+3 -3)
README.md CHANGED
@@ -41,20 +41,20 @@ The difference between World & Raven:
 
 NOTE: the new greedy tokenizer (https://github.com/BlinkDL/ChatRWKV/blob/main/tokenizer/rwkv_tokenizer.py) will tokenize '\n\n' as one single token instead of ['\n','\n']
 
-prompt:
+prompt (replace \n\n in xxx to \n):
 ```
 Instruction: xxx
 Input: xxx
 Response:
 ```
 
-A good chat prompt:
+A good chat prompt (replace \n\n in xxx to \n):
 ```
 Question: hi
 
 Answer: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.
 
-Question: xxxxxx
+Question: xxx
 
 Answer:
 ```
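For concreteness, here is a minimal Python sketch of how these prompt strings might be assembled, applying the README's rule of replacing '\n\n' with '\n' inside the user-supplied xxx parts so that the only double newlines left are the separators the greedy tokenizer encodes as a single token. The helper functions are illustrative, not part of ChatRWKV.

```python
def make_instruct_prompt(instruction: str, input_text: str) -> str:
    # Collapse '\n\n' inside user-supplied text to '\n', per the README note
    # that the greedy tokenizer treats '\n\n' as one token.
    instruction = instruction.strip().replace("\n\n", "\n")
    input_text = input_text.strip().replace("\n\n", "\n")
    return f"Instruction: {instruction}\nInput: {input_text}\nResponse:"


def make_chat_prompt(question: str) -> str:
    # Chat-style prompt with the priming exchange from the README; turns are
    # separated by blank lines ('\n\n').
    question = question.strip().replace("\n\n", "\n")
    return (
        "Question: hi\n\n"
        "Answer: Hi. I am your assistant and I will provide expert full response "
        "in full details. Please feel free to ask any question and I will always "
        "answer it.\n\n"
        f"Question: {question}\n\n"
        "Answer:"
    )


if __name__ == "__main__":
    print(make_instruct_prompt("Summarize the text below.",
                               "RWKV is an RNN with\n\ntransformer-level performance."))
    print()
    print(make_chat_prompt("What is RWKV?"))
```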