
Conversation

@adobrzyn
Collaborator

@adobrzyn adobrzyn commented Nov 6, 2025

If VLLM_ENABLE_EXPERIMENTAL_FLAGS is set to 0 or not set, warmup logs stay hidden and only a progress bar is shown.
Enabling this flag brings back the old detailed logs.

Additionally, the VLLM_USE_V1 flag is removed.

Additionally, user flags are no longer treated as experimental.
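
For reference, a minimal sketch of the new default behaviour, assuming a plain environment-variable check plus tqdm (the function and variable names below are illustrative, not the actual identifiers in hpu_model_runner.py):

```python
import logging
import os
import time

from tqdm import tqdm

logger = logging.getLogger(__name__)


def experimental_flags_enabled() -> bool:
    # Unset or "0" means the quiet default: progress bar only, no warmup logs.
    return os.environ.get("VLLM_ENABLE_EXPERIMENTAL_FLAGS", "0") not in ("", "0")


def run_warmup(bucket_configs) -> None:
    verbose = experimental_flags_enabled()
    for cfg in tqdm(bucket_configs, desc="Warmup"):
        if verbose:
            logger.info("Warming up bucket %s", cfg)  # the old detailed log line
        time.sleep(0.01)  # stand-in for the real HPU warmup step


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    run_warmup([(bs, seq) for bs in (1, 2, 4) for seq in (128, 256, 512)])
```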

Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
@github-actions

github-actions bot commented Nov 6, 2025

🚧 CI Blocked

The main CI workflow was not started for the following reason:

Your branch is behind the base branch. Please merge or rebase to get the latest changes.

Contributor

Copilot AI left a comment


Pull Request Overview

This PR improves the user experience during model warmup by hiding verbose logs behind a developer flag and adding progress bars. When VLLM_ENABLE_EXPERIMENTAL_FLAGS is disabled (default), users see only progress bars. Enabling the flag restores detailed logging for developers. The PR also removes the deprecated VLLM_USE_V1 flag and reclassifies user flags as no longer experimental.

Key changes:

  • Warmup logs now hidden by default, replaced with tqdm progress bars
  • VLLM_USE_V1 flag removed from user flags
  • User flags excluded from experimental flags list

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.

| File | Description |
| --- | --- |
| vllm_gaudi/v1/worker/hpu_model_runner.py | Adds tqdm progress bars for warmup phases and gates detailed logging behind VLLM_ENABLE_EXPERIMENTAL_FLAGS |
| vllm_gaudi/extension/runtime.py | Filters user flags out of the experimental flags list to prevent duplicate categorization |
| vllm_gaudi/extension/features.py | Removes the deprecated VLLM_USE_V1 flag from the user flags list |
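
The runtime.py change boils down to a set difference between the two flag categories. A minimal sketch of that idea, with hypothetical flag names and variable names (the real module works with its own flag definitions):

```python
# Hypothetical flag names; only the set-difference pattern reflects the change.
user_flags = {"VLLM_EXPONENTIAL_BUCKETING", "VLLM_BUCKETING_FROM_FILE"}
all_flags = {"VLLM_EXPONENTIAL_BUCKETING", "VLLM_BUCKETING_FROM_FILE", "VLLM_SOME_DEV_FLAG"}

# User-facing flags are excluded from the experimental list, so no flag is
# reported under both categories.
experimental_flags = sorted(all_flags - user_flags)
print(experimental_flags)  # ['VLLM_SOME_DEV_FLAG']
```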


Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
@github-actions

github-actions bot commented Nov 6, 2025

🚧 CI Blocked

The main CI workflow was not started for the following reason:

Your branch is behind the base branch. Please merge or rebase to get the latest changes.

adobrzyn and others added 2 commits November 6, 2025 15:44
Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
@github-actions

github-actions bot commented Nov 6, 2025

✅ CI Passed

All checks passed successfully against the following vllm commit:
0384aa7150c4c9778efca041ffd1beb3ad2bd694

@github-actions

github-actions bot commented Nov 7, 2025

✅ CI Passed

All checks passed successfully against the following vllm commit:
0384aa7150c4c9778efca041ffd1beb3ad2bd694

@github-actions

✅ CI Passed

All checks passed successfully against the following vllm commit:
2dacd5739409847e91299e7747a142e200fdff6c

adobrzyn and others added 3 commits November 14, 2025 09:35
Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
| `VLLM_EXPONENTIAL_BUCKETING` | Enables exponential bucket spacing instead of linear spacing. | `true` |
| `VLLM_BUCKETING_FROM_FILE` | Enables reading bucket configuration from file | `None` |

## Experimental Parameters
Collaborator


There is still a mention of "Experimental" in the code:

"From v0.12.0 release using those flags without VLLM_ENABLE_EXPERIMENTAL_FLAGS will trigger a fatal error.")

Should it be renamed?
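
For context, the quoted message suggests a guard roughly like the sketch below; this is a guess at the pattern, not the actual code, and the function name is made up:

```python
import os
import warnings


def check_experimental_flags(set_flags: list[str]) -> None:
    # Hypothetical guard matching the quoted message: warn today, with the
    # stated intent of making this a fatal error from the v0.12.0 release.
    if set_flags and not os.environ.get("VLLM_ENABLE_EXPERIMENTAL_FLAGS"):
        warnings.warn(
            f"Flags {set_flags} require VLLM_ENABLE_EXPERIMENTAL_FLAGS. "
            "From v0.12.0 release using those flags without "
            "VLLM_ENABLE_EXPERIMENTAL_FLAGS will trigger a fatal error."
        )
```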

@github-actions

✅ CI Passed

All checks passed successfully against the following vllm commit:
da14ae0fad3165b88fcdc03a8f59f1813f8e832a

@adobrzyn adobrzyn merged commit 3163498 into main Nov 14, 2025
38 checks passed