Enable flash_attention_2 support since the underlying Mistral model supports it

#3
by winglian - opened
No description provided.
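
The PR carries no description, but enabling flash_attention_2 for a Mistral-based checkpoint typically comes down to opting in when loading the model. A minimal sketch using the standard transformers loader flag follows; the model id is a placeholder for whichever repo this PR targets, and flash-attn must be installed separately (`pip install flash-attn`).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id -- substitute the actual Hub repo this PR was opened against.
model_id = "some-org/mistral-based-model"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="flash_attention_2",  # opt in to FlashAttention-2 kernels
    torch_dtype=torch.bfloat16,               # FA2 requires fp16 or bf16 weights
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```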
lievan changed pull request status to merged