
"Upstage" or "upstage"?

#14
by agershun - opened
Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -119,9 +119,9 @@ Use the following Python code to load the model:
 import torch
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-tokenizer = AutoTokenizer.from_pretrained("Upstage/SOLAR-10.7B-Instruct-v1.0")
+tokenizer = AutoTokenizer.from_pretrained("upstage/SOLAR-10.7B-Instruct-v1.0")
 model = AutoModelForCausalLM.from_pretrained(
-    "Upstage/SOLAR-10.7B-Instruct-v1.0",
+    "upstage/SOLAR-10.7B-Instruct-v1.0",
     device_map="auto",
     torch_dtype=torch.float16,
 )