lysandre and amyeroberts committed
Commit: 3f5c25d
Parent: 8c7b107

Update README.md (#28)


- Update README.md (32aceabbd9ade9e3dae08965a4d373431d29e186)


Co-authored-by: Amy Roberts <amyeroberts@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -54,8 +54,8 @@ You can use this model directly with a pipeline for text generation.
 >>> from transformers import pipeline
 
 >>> generator = pipeline('text-generation', model="facebook/opt-1.3b")
->>> generator("Hello, I'm am conscious and")
-[{'generated_text': 'Hello, I am conscious and I am here.\nI am here.\nI am conscious.'}]
+>>> generator("What are we having for dinner?")
+[{'generated_text': "What are we having for dinner?\nI'm not sure. I'm not a chef. I'"}]
 ```
 
 By default, generation is deterministic. In order to use the top-k sampling, please set `do_sample` to `True`.
@@ -65,8 +65,8 @@ By default, generation is deterministic. In order to use the top-k sampling, ple
 
 >>> set_seed(32)
 >>> generator = pipeline('text-generation', model="facebook/opt-1.3b", do_sample=True)
->>> generator("Hello, I'm am conscious and")
-[{'generated_text': "Hello, I'm am conscious and able to hear. I have a lot of experience in the"}]
+>>> generator("What are we having for dinner?")
+[{'generated_text': "What are we having for dinner?\nI have had chicken and rice for lunch. It is delicious"}]
 ```
 
 ### Limitations and bias
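
For reference, the two updated snippets from the diff above combined into one runnable sketch, assuming `transformers` with a PyTorch backend is installed and the `facebook/opt-1.3b` checkpoint can be downloaded; sampled outputs will differ from the example text shown in the diff.

```python
# Sketch of the updated README usage: greedy generation first, then top-k sampling.
from transformers import pipeline, set_seed

# By default, generation is deterministic (greedy decoding).
generator = pipeline("text-generation", model="facebook/opt-1.3b")
print(generator("What are we having for dinner?"))

# Enable sampling by passing do_sample=True; set a seed first for reproducibility.
set_seed(32)
sampler = pipeline("text-generation", model="facebook/opt-1.3b", do_sample=True)
print(sampler("What are we having for dinner?"))
```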