no apply_rotary_emb function

#64
by galazzo - opened

Running the model on Windows with flash_attn==2.7.1.post4 and transformers==4.56.0.dev0 installed, I get the error:

from transformers.modeling_flash_attention_utils import apply_rotary_emb, flash_attn_varlen_func
ImportError: cannot import name 'apply_rotary_emb' from 'transformers.modeling_flash_attention_utils'

I double-checked the transformers package, and apply_rotary_emb does not exist in transformers.modeling_flash_attention_utils.

I'm forced to uninstall flash_attn to get the code working.
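For context on what the missing function does: apply_rotary_emb applies rotary position embeddings, rotating pairs of channels by position-dependent angles. Below is a minimal NumPy sketch of that computation (an illustration only, not the transformers or flash_attn implementation, and the pairing convention may differ from theirs):

```python
import numpy as np

def apply_rotary_emb(x, cos, sin):
    """Rotate each (even, odd) channel pair of x by a position-dependent angle.

    x:   (seq_len, dim) array of features
    cos: (seq_len, dim // 2) cosines of the rotation angles
    sin: (seq_len, dim // 2) sines of the rotation angles
    """
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin  # 2D rotation of each channel pair
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

seq_len, dim = 4, 8
# Standard RoPE frequency schedule (base 10000).
inv_freq = 1.0 / (10000 ** (np.arange(0, dim, 2) / dim))
angles = np.outer(np.arange(seq_len), inv_freq)  # (seq_len, dim // 2)

x = np.ones((seq_len, dim))
y = apply_rotary_emb(x, np.cos(angles), np.sin(angles))
# Position 0 has angle 0, so its row is unchanged; rotations preserve norms.
```

Because each pair is rotated by a pure rotation, the per-position feature norm is preserved, which is why the function can be applied to queries and keys without rescaling attention scores.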

Jina AI org

Thank you for reporting the issue. This might be due to recent changes in the transformers library. We will investigate when we find time for it. Meanwhile, you could try an older version; e.g., 4.52.0 still had the apply_rotary_emb function: https://github.com/huggingface/transformers/blob/113424bcd53b92600f77d82f48add0a60fb41556/src/transformers/modeling_flash_attention_utils.py#L38

It should be fixed now: https://huggingface.co/jinaai/jina-embeddings-v4/discussions/67 Can you try it again?
