The call to the LLM sometimes comes back with an empty message: no tool calls and no response content at all. The same request works fine at other times, without any issues.
Here is the code snippet that calls the OpenAIResponseAgent's InvokeStreamingAsync:
```csharp
await foreach (var streamingContent in openAiAgent.InvokeStreamingAsync(streamContent, agentThread, agentInvokeOptions, cancellationToken: cts.Token))
```