ArmelR committed
Commit 32cb54d
1 Parent(s): 52181bf

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -28,7 +28,7 @@ For example, your prompt can look like
  instruction = "Write a function to compute the GCD between two integers a and b"
  prompt = f"Question:{instruction}\n\nAnswer:"
  input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"]
- completion = model.generate(input_ids)
+ completion = model.generate(input_ids, max_length=100)
  print(tokenizer.decode(completion[len(input_ids):])[0])
  ```
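
For context, a minimal end-to-end sketch of the snippet this commit edits is shown below. The checkpoint name is a placeholder (the README's actual model repo is not named in this diff), and the `torch.no_grad()` wrapper plus the prompt-stripping slice are assumptions added for illustration, not part of the commit itself.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder checkpoint; substitute the model repo this README documents.
checkpoint = "your-org/your-model"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

instruction = "Write a function to compute the GCD between two integers a and b"
prompt = f"Question:{instruction}\n\nAnswer:"
input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"]

# max_length bounds prompt + generated tokens, as added in this commit.
with torch.no_grad():
    completion = model.generate(input_ids, max_length=100)

# completion has shape (batch, seq_len); drop the prompt tokens before decoding.
print(tokenizer.decode(completion[0][input_ids.shape[1]:]))
```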