please add Usage #1
opened by shibing624
Please add example code for prediction with transformers.
Hi, please refer to our github repo for example code:
How to prevent the warning below?

The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's attention_mask to obtain reliable results.
Setting pad_token_id to eos_token_id:11 for open-end generation.
It can simply be ignored, I guess.
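The warning can also be silenced at the source by passing the attention mask and a pad token id explicitly to generate(). A minimal sketch, assuming a GPT-2-style causal LM ("gpt2" here is a placeholder; substitute the model from this repo):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder model; replace with the repo's model id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# return_tensors="pt" gives us both input_ids and attention_mask
inputs = tokenizer("Hello, world", return_tensors="pt")

# Passing attention_mask and pad_token_id explicitly avoids the warning
output = model.generate(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    pad_token_id=tokenizer.eos_token_id,
    max_new_tokens=20,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

For open-ended generation with models that have no dedicated pad token, reusing eos_token_id as the pad token (as the warning message itself suggests) is the usual choice.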