
Commit 1b7624b

[misc] Add FlashMLA as a new option of VLLM_ATTENTION_BACKEND env (#14267)
1 parent: ac60dc7

File tree

1 file changed: 1 addition, 0 deletions

vllm/envs.py

Lines changed: 1 addition & 0 deletions
@@ -321,6 +321,7 @@ def maybe_convert_int(value: Optional[str]) -> Optional[int]:
     # - "XFORMERS": use XFormers
     # - "ROCM_FLASH": use ROCmFlashAttention
     # - "FLASHINFER": use flashinfer
+    # - "FLASHMLA": use FlashMLA
     "VLLM_ATTENTION_BACKEND":
     lambda: os.getenv("VLLM_ATTENTION_BACKEND", None),
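
For context, the attention backend is selected purely through this environment variable, which vllm/envs.py resolves lazily via os.getenv. Below is a minimal sketch of opting into the new FlashMLA backend from Python; it assumes only what the diff shows (the variable name, the "FLASHMLA" value, and the os.getenv lookup), nothing else from this commit.

import os

# Opt into the FlashMLA backend added by this commit; the variable must be set
# before vLLM reads it (vllm/envs.py resolves it lazily with os.getenv).
os.environ["VLLM_ATTENTION_BACKEND"] = "FLASHMLA"

# Same lookup pattern as in vllm/envs.py: the string if set, otherwise None.
backend = os.getenv("VLLM_ATTENTION_BACKEND", None)
assert backend == "FLASHMLA"

Equivalently, the variable can be exported in the shell (VLLM_ATTENTION_BACKEND=FLASHMLA) before launching vLLM.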