Hello, I'd like to ask: can I use this model without flash-attention? Whenever I try to install flash-attention, I get an error.