Fix the issue with the latest transformers update
On the first iteration, where `inputs_embeds` is used to start generation, the conditional check fails: as of the latest transformers version, `past_key_values` no longer returns `None` but an empty `DynamicCache()`. Checking `input_ids` instead (it is `None` or empty on the first step) restores the intended behavior.
modeling_phi.py (+1 -1)

@@ -1161,7 +1161,7 @@ class PhiForCausalLM(PhiPreTrainedModel):
             position_ids = position_ids[:, -input_ids.shape[1] :]

         # if `inputs_embeds` are passed, we only want to use them in the 1st generation step
-        if inputs_embeds is not None and past_key_values is None:
+        if inputs_embeds is not None and (input_ids is None or input_ids.shape[1] == 0):
             model_inputs = {"inputs_embeds": inputs_embeds}
         else:
             model_inputs = {"input_ids": input_ids}
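The corrected condition can be sketched in isolation. This is a minimal illustration, not the actual `prepare_inputs_for_generation` method; the helper name `select_model_inputs` is hypothetical, and NumPy arrays stand in for torch tensors (only `.shape` is used).

```python
import numpy as np

def select_model_inputs(input_ids, inputs_embeds):
    # Hypothetical helper sketching the fixed condition: use `inputs_embeds`
    # only on the first generation step, i.e. when there are no new token ids
    # left to process. Unlike a `past_key_values is None` check, this still
    # works when the cache is an empty `DynamicCache()` rather than `None`.
    if inputs_embeds is not None and (input_ids is None or input_ids.shape[1] == 0):
        return {"inputs_embeds": inputs_embeds}
    return {"input_ids": input_ids}
```

On the first step `input_ids` has been sliced down to zero length (or is `None`), so the embeddings are forwarded; on every later step the newly generated token ids are used.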