Fix InternLM2ForCausalLM does not support Flash Attention 2.0 yet

#3 opened by kosung
No description provided.
czczup changed pull request status to merged