ArmelR committed on
Commit
7a4a4fe
1 Parent(s): 32cb54d

Update README.md

Files changed (1): README.md +2 -2
README.md CHANGED
@@ -28,7 +28,7 @@ For example, your prompt can look like
 instruction = "Write a function to compute the GCD between two integers a and b"
 prompt = f"Question:{instruction}\n\nAnswer:"
 input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"]
-completion = model.generate(input_ids, max_length=100)
-print(tokenizer.decode(completion[len(input_ids):])[0])
+completion = model.generate(input_ids, max_length=200)
+print(tokenizer.batch_decode(completion[:,input_ids.shape[1]:])[0])
 ```
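The indexing change in this diff matters because `model.generate` returns a 2D tensor of shape `(batch, sequence_length)`: the old `completion[len(input_ids):]` slices away batch *rows* rather than stripping the prompt *tokens*, while the new `completion[:, input_ids.shape[1]:]` keeps every row but drops the prompt prefix. A minimal sketch of the difference, using plain Python lists in place of tensors (the token IDs are made up for illustration):

```python
# A batch of one prompt of 4 tokens, shape (1, 4). IDs are arbitrary examples.
input_ids = [[101, 7592, 2088, 102]]

# generate() returns prompt tokens followed by new tokens, shape (1, 6).
completion = [[101, 7592, 2088, 102, 11, 12]]

# Old (buggy) indexing: len(input_ids) == 1 (the batch size), so this
# drops the first batch row, leaving nothing to decode.
buggy = completion[len(input_ids):]
print(buggy)  # []

# Fixed indexing, mirroring completion[:, input_ids.shape[1]:] in the diff:
# slice each row past the prompt length, keeping only the generated tokens.
prompt_len = len(input_ids[0])
fixed = [row[prompt_len:] for row in completion]
print(fixed)  # [[11, 12]]
```

The switch from `tokenizer.decode` to `tokenizer.batch_decode` follows from the same shape: the sliced result is still batched, so it is decoded per row and the first decoded string is printed.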