patrickvonplaten committed on
Commit b0cf192 · 1 Parent(s): 567801d

Correct long-form generation config parameters 'max_initial_timestamp_index' and 'prev_sot_token_id'.


Hey MU-NLPC 👋,

Your model repository seems to contain an outdated generation config: the 'max_initial_timestamp_index' parameter is out of date and the 'prev_sot_token_id' parameter is missing. These parameters need to be updated to correctly handle long-form generation, as described in https://github.com/huggingface/transformers/pull/27658. This PR makes sure that everything is up to date and can be safely merged.
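In case you would rather apply the change locally instead of merging this PR, a minimal sketch using the GenerationConfig API is shown below. The repository id and output directory are placeholders, not the name of your actual checkpoint:

from transformers import GenerationConfig

# Load the current generation config (placeholder repo id for illustration).
gen_config = GenerationConfig.from_pretrained("MU-NLPC/your-whisper-checkpoint")

# Long-form generation parameters corrected by this PR.
gen_config.max_initial_timestamp_index = 50
gen_config.prev_sot_token_id = 50361  # id of Whisper's <|startofprev|> token

# Write the updated generation_config.json into a local checkpoint directory.
gen_config.save_pretrained("./your-whisper-checkpoint")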

Best, the Transformers team.

Files changed (1)
  1. generation_config.json (+2, -1)
generation_config.json CHANGED
@@ -127,10 +127,11 @@
     "<|zh|>": 50260
   },
   "language": "english",
-  "max_initial_timestamp_index": 1,
+  "max_initial_timestamp_index": 50,
   "max_length": 448,
   "no_timestamps_token_id": 50363,
   "pad_token_id": 50257,
+  "prev_sot_token_id": 50361,
   "return_timestamps": false,
   "suppress_tokens": [
     1,
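
With the updated generation_config.json merged, transcription of audio longer than 30 seconds should be handled by the sequential long-form decoding path introduced in the linked PR. A minimal sketch, assuming a placeholder checkpoint id and a local audio file:

from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="MU-NLPC/your-whisper-checkpoint",  # placeholder repo id
)

# Audio longer than 30 seconds triggers Whisper's long-form generation, which
# uses max_initial_timestamp_index and (when conditioning on previous tokens)
# prev_sot_token_id from the generation config.
result = asr("long_audio.wav", return_timestamps=True)
print(result["text"])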