dmayhem93 committed on
Commit 1fd5d0e
1 Parent(s): a834036

Update README.md

Files changed (1)
  1. README.md +4 -4
README.md CHANGED

```diff
@@ -9,7 +9,7 @@ language:
 - en
 pipeline_tag: text-generation
 ---
-# FreeWilly
+# StableBeluga_13B
 
 ## Model Description
 
@@ -25,7 +25,7 @@ from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
 
 tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga_13B", use_fast=False)
 model = AutoModelForCausalLM.from_pretrained("stabilityai/StableBeluga_13B", torch_dtype=torch.float16, low_cpu_mem_usage=True, device_map="auto")
-system_prompt = "### System:\nYou are Free Willy, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n"
+system_prompt = "### System:\nYou are StableBeluga_13B, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n"
 
 message = "Write me a poem please"
 prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n"
@@ -35,7 +35,7 @@ output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_t
 print(tokenizer.decode(output[0], skip_special_tokens=True))
 ```
 
-FreeWilly should be used with this prompt format:
+StableBeluga_13B should be used with this prompt format:
 ```
 ### System:
 This is a system prompt, please behave and help the user.
@@ -44,7 +44,7 @@ This is a system prompt, please behave and help the user.
 Your prompt here
 
 ### Assistant
-The output of FreeWilly2
+The output of StableBeluga_13B
 ```
 
 ## Model Details
```
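The prompt-formatting portion of the updated snippet can be exercised standalone, without downloading the model; a minimal sketch, with the system prompt and format copied from the `+` lines of this change:

```python
# Prompt construction for StableBeluga_13B, reproducing the system prompt
# and "### User / ### Assistant" format from the updated README.
system_prompt = (
    "### System:\nYou are StableBeluga_13B, an AI that follows instructions "
    "extremely well. Help as much as you can. Remember, be safe, and don't "
    "do anything illegal.\n\n"
)

message = "Write me a poem please"
prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n"
print(prompt)
```

The resulting string is what the README passes to `tokenizer` before calling `model.generate`.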