ImportError: cannot import name 'is_flash_attn_available' from 'transformers.utils'

#16
by nayeon212

In recent versions of transformers, the function is_flash_attn_available() has been removed and replaced by is_flash_attn_2_available(). It would be great if you could update the modeling_orion.py code to use is_flash_attn_2_available() instead. Alternatively, it would help to document a compatible transformers version (transformers < 4.35) that still includes the old helper.
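For reference, here is a minimal compatibility shim the modeling code could use (just a sketch, not the actual fix applied in this repo), assuming only that transformers renamed the helper around v4.35:

```python
# Version-agnostic import guard: prefer the new helper name, fall back
# to the old one on older transformers releases.
try:
    # Newer transformers (>= 4.35) expose the renamed helper.
    from transformers.utils import is_flash_attn_2_available
except ImportError:
    # Older transformers still ship the original name; alias it so the
    # rest of the modeling code can call is_flash_attn_2_available() uniformly.
    from transformers.utils import (
        is_flash_attn_available as is_flash_attn_2_available,
    )
```

Otherwise, pinning the older release with `pip install "transformers<4.35"` keeps the original helper available.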

Thanks a lot!

OrionStarAI org

fixed! thanks!
