kuotient committed on
Commit 57e6bb0
• 1 Parent(s): f3dd720

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -63,6 +63,7 @@ It follows only **ChatML** format.
 ```
 
 #### Example code
+**I highly recommend running inference with vLLM. I will write a guide for quick and easy inference if requested.**
 Since the chat_template already contains the instruction format above,
 you can use the code below.
 ```python
@@ -71,7 +72,8 @@ device = "cuda" # the device to load the model onto
 model = AutoModelForCausalLM.from_pretrained("kuotient/Seagull-13B-translation")
 tokenizer = AutoTokenizer.from_pretrained("kuotient/Seagull-13B-translation")
 messages = [
-    {"role": "user", "content": "바나나는 원래 하얀색이야?"},  # "Are bananas originally white?"
+    {"role": "system", "content": "주어진 문장을 한국어로 번역하세요."},  # "Translate the given sentence into Korean."
+    {"role": "user", "content": "Here are five examples of nutritious foods to serve your kids."},
 ]
 encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")
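The README notes that the model follows the **ChatML** format, and `tokenizer.apply_chat_template` renders the `messages` list into that layout before generation. As a rough illustration of what that rendered prompt looks like, here is a minimal sketch; `to_chatml` is a hypothetical helper written for this example, not part of the tokenizer, whose authoritative template ships with the model.

```python
# Hypothetical helper illustrating the ChatML prompt that
# tokenizer.apply_chat_template is expected to build from `messages`.
# This is only a sketch of the ChatML layout the README mentions;
# the real template is bundled with the tokenizer.
def to_chatml(messages):
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    # A trailing assistant header cues the model to generate the reply.
    return prompt + "<|im_start|>assistant\n"

messages = [
    {"role": "system", "content": "주어진 문장을 한국어로 번역하세요."},  # "Translate the given sentence into Korean."
    {"role": "user", "content": "Here are five examples of nutritious foods to serve your kids."},
]
print(to_chatml(messages))
```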