Labels
bug (Something isn't working), gpt-oss (Related to GPT-OSS models)
Description
Your current environment
vLLM v0.10.1 and v0.10.1.1, official Docker image, model gpt-oss-120b.

🐛 Describe the bug
I keep hitting the error `openai_harmony.HarmonyError: unexpected tokens remaining in message header` in multi-turn conversations with gpt-oss-120b, on both vLLM v0.10.1 and v0.10.1.1, using the official Docker image. A lot of users seem to have this problem; please also see openai/harmony#38. However, I haven't seen it reported by users who aren't using vLLM, so it may very well be a vLLM problem.
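For reference, a minimal sketch of the kind of multi-turn request that triggers the error. The helper, the server URL, and the model name below are illustrative assumptions (a vLLM OpenAI-compatible server from the official Docker image, assumed at `http://localhost:8000/v1`), not details confirmed by this report:

```python
# Hypothetical multi-turn reproduction sketch for the HarmonyError.
# Assumes a vLLM server started from the official Docker image, e.g.:
#   vllm serve openai/gpt-oss-120b
# Endpoint URL, API key, and model name are placeholders.

def build_multi_turn_messages(history, user_msg):
    """Build an OpenAI-style chat history: prior (user, assistant) turn
    pairs followed by the new user message. Pure data, no network I/O."""
    messages = []
    for user_text, assistant_text in history:
        messages.append({"role": "user", "content": user_text})
        messages.append({"role": "assistant", "content": assistant_text})
    messages.append({"role": "user", "content": user_msg})
    return messages

if __name__ == "__main__":
    msgs = build_multi_turn_messages(
        [("Hello", "Hi, how can I help?")], "Summarize our chat."
    )
    # Sending a history like this on the second turn is where the
    # "unexpected tokens remaining in message header" error is reported:
    # from openai import OpenAI
    # client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
    # client.chat.completions.create(
    #     model="openai/gpt-oss-120b", messages=msgs)
    print(len(msgs))  # 3
```

The first request (no assistant turn in the history) reportedly works; the error surfaces once a previous assistant reply is fed back as context.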
Before submitting a new issue...
- [x] Make sure you have already searched for relevant issues and asked the chatbot at the bottom right corner of the documentation page, which can answer many frequently asked questions.
Metadata
Assignees: none
Labels: bug, gpt-oss
Project status: To Triage