System Info
Hi!
I don't have Flash Attention 2 installed, but I do have Flash Attention 3 installed. I get a versioning error, even though the check should just skip FA2 and continue with FA3:
File "x/miniconda3/envs/animateai/lib/python3.11/site-packages/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py", line 1386, in __init__
super().__init__(config)
File "x/miniconda3/envs/animateai/lib/python3.11/site-packages/transformers/modeling_utils.py", line 1277, in __init__
self.config._attn_implementation_internal = self._check_and_adjust_attn_implementation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "x/miniconda3/envs/animateai/lib/python3.11/site-packages/transformers/modeling_utils.py", line 1791, in _check_and_adjust_attn_implementation
and not (is_flash_attn_2_available() or is_flash_attn_3_available())
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "x/miniconda3/envs/animateai/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 889, in is_flash_attn_2_available
return version.parse(flash_attn_version) >= version.parse("2.1.0")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "x/miniconda3/envs/animateai/lib/python3.11/site-packages/packaging/version.py", line 56, in parse
return Version(version)
^^^^^^^^^^^^^^^^
File "x/miniconda3/envs/animateai/lib/python3.11/site-packages/packaging/version.py", line 202, in __init__
raise InvalidVersion(f"Invalid version: {version!r}")
packaging.version.InvalidVersion: Invalid version: 'N/A'
'N/A' is the placeholder version that transformers assigns before it looks the package up, so when flash_attn (FA2) is not installed the placeholder is what ends up in version.parse() and the parse fails.
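For reference, the failing comparison can be reproduced with packaging alone; this is just to illustrate why the 'N/A' placeholder should never reach version.parse():

```python
# Illustration only: packaging cannot parse the 'N/A' placeholder that is used
# when flash_attn (FA2) is not installed, so the >= comparison never happens.
from packaging import version

try:
    version.parse("N/A") >= version.parse("2.1.0")
except version.InvalidVersion as err:
    print(err)  # Invalid version: 'N/A'
```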
Who can help?
@vasqu @ArthurZucker @Cyrilvallez
Reproduction
Steps to reproduce:
- Install Flash Attention 3 only (no Flash Attention 2)
- Use the transformers dev code (installed from source); a minimal sketch follows below
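A minimal reproduction sketch, assuming the Qwen2.5-VL model from the traceback (the exact checkpoint id and the explicit attn_implementation value are my assumptions, not part of the original report):

```python
# Minimal repro sketch. Assumes flash-attn 3 is installed and flash-attn 2 is not.
# The checkpoint id below is an assumption; any Qwen2.5-VL checkpoint should take
# the same path through _check_and_adjust_attn_implementation.
from transformers import Qwen2_5_VLForConditionalGeneration

model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "Qwen/Qwen2.5-VL-7B-Instruct",
    attn_implementation="flash_attention_3",
)
# Raises packaging.version.InvalidVersion: Invalid version: 'N/A'
```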
Expected behavior
The code should run with FA3: is_flash_attn_2_available() should return False instead of raising when flash_attn is not installed.
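For what it's worth, a sketch of the shape of a fix, assuming the check keeps comparing versions with packaging (illustrative only, this is a hypothetical helper and not the actual transformers code):

```python
# Hedged sketch of a guarded availability check (hypothetical helper, not the
# transformers implementation): bail out before version.parse() ever sees the
# 'N/A' placeholder.
import importlib.metadata
import importlib.util

from packaging import version


def flash_attn_2_available() -> bool:
    # No flash_attn on the system -> FA2 is not available, nothing to parse.
    if importlib.util.find_spec("flash_attn") is None:
        return False
    try:
        return version.parse(importlib.metadata.version("flash_attn")) >= version.parse("2.1.0")
    except (importlib.metadata.PackageNotFoundError, version.InvalidVersion):
        return False
```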