
[Bug]: openai_harmony.HarmonyError: unexpected tokens remaining in message header #23567

@MoellerAI

Description


Your current environment

I keep hitting this error: "openai_harmony.HarmonyError: unexpected tokens remaining in message header"

It happens in multi-turn conversations with gpt-oss-120b, on both vLLM v0.10.1 and v0.10.1.1, using the official Docker image. A lot of users seem to have this problem; please also see this issue: openai/harmony#38

But I haven't seen the problem reported by users who aren't using vLLM, so it might very well be a vLLM problem.
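For reference, a minimal multi-turn repro sketch against a vLLM OpenAI-compatible server (the base URL, API key, and served model name below are assumptions; adjust them to your deployment). The error reportedly surfaces once assistant output is fed back into the conversation history on a later turn:

```python
# Minimal multi-turn repro sketch. Assumes a vLLM server is already running
# (e.g. the official Docker image serving gpt-oss-120b) and reachable at the
# base URL below -- both the URL and the model name are assumptions.
from typing import Dict, List


def build_next_request(history: List[Dict[str, str]], user_turn: str) -> List[Dict[str, str]]:
    """Append the next user turn to the running conversation history."""
    return history + [{"role": "user", "content": user_turn}]


if __name__ == "__main__":
    # Requires `pip install openai` and a running vLLM server.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # assumed endpoint
    history: List[Dict[str, str]] = []
    for turn in ["Hello!", "Can you expand on that?"]:  # second turn triggers multi-turn path
        messages = build_next_request(history, turn)
        reply = client.chat.completions.create(
            model="openai/gpt-oss-120b",  # assumed served model name
            messages=messages,
        )
        content = reply.choices[0].message.content
        # Feeding the assistant reply back as history is what exercises the
        # Harmony header parsing on subsequent turns.
        history = messages + [{"role": "assistant", "content": content}]
```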

🐛 Describe the bug

See the description above.



Labels: bug (Something isn't working), gpt-oss (Related to GPT-OSS models)
