Unexpected keyword argument 'cache_position'

#16
by lmurillo - opened

I am experiencing issues when trying to use this model. I get the following error message:
internlm2_attention_forward() got an unexpected keyword argument 'cache_position'

I tried different transformers versions (4.37.1, 4.38.2, and 4.41.2) without success. Has anyone seen this error before?
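For what it's worth, this kind of error usually means a newer transformers release is passing a keyword argument (here `cache_position`) that the model's remote modeling code was written before and doesn't accept. As a generic, hedged workaround sketch (not specific to this repository's code), you can wrap a forward function so that keyword arguments absent from its signature are dropped; `attention_forward` below is a hypothetical stand-in for the real `internlm2_attention_forward`:

```python
import inspect

def drop_unsupported_kwargs(fn):
    """Wrap fn so keyword arguments not in its signature are silently dropped."""
    params = inspect.signature(fn).parameters
    # If fn already accepts **kwargs, no filtering is needed.
    accepts_var_kw = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )

    def wrapper(*args, **kwargs):
        if not accepts_var_kw:
            kwargs = {k: v for k, v in kwargs.items() if k in params}
        return fn(*args, **kwargs)

    return wrapper

# Hypothetical stand-in: an attention forward that predates `cache_position`.
def attention_forward(hidden_states, attention_mask=None):
    return hidden_states

patched = drop_unsupported_kwargs(attention_forward)
# Passing cache_position no longer raises a TypeError:
out = patched([1, 2, 3], attention_mask=None, cache_position=0)
```

Silently discarding the argument is only safe if the model doesn't actually need it for correct caching, so pinning transformers to a version whose signature matches the remote code is the safer fix when one exists.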