Is flash-attention-2 supported?

#8 opened by Jack7777777
Alibaba-NLP org

Hi, xformers can dispatch to the flash-attention-2 kernel on its own, so I commented out this separate entry point.
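As a rough illustration of this kind of automatic dispatch (shown here with PyTorch's built-in scaled dot-product attention rather than xformers, and not taken from this repository's code): the caller invokes one generic attention entry point, and the dispatcher selects the fastest eligible kernel, such as FlashAttention on supported GPUs or a math fallback on CPU, so no explicit flash-attention code path is needed.

```python
import torch
import torch.nn.functional as F

# Dummy (batch, heads, seq_len, head_dim) tensors for illustration.
q = torch.randn(1, 8, 128, 64)
k = torch.randn(1, 8, 128, 64)
v = torch.randn(1, 8, 128, 64)

# One generic call; the backend (FlashAttention, memory-efficient,
# or the plain math implementation) is chosen by the dispatcher,
# so no separate flash-attention entry point is required.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 128, 64])
```

xformers' `memory_efficient_attention` behaves analogously, which is why the dedicated flash-attention-2 path mentioned above was redundant.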
