
🐛 Bug Report: (OpenAI instrumentation) Streaming response traces and metrics lost when stream not fully consumed #3151

@minimAluminiumalism


Which component is this bug for?

OpenAI Instrumentation

📜 Description

I've observed this issue in the OpenAI instrumentation; further testing is needed to determine whether it also affects the Ollama instrumentation.

When using streaming chat completions, traces and metrics are only recorded if the stream is fully consumed (iterated to completion). If the stream object is created but not consumed, all traces and metrics data are lost.

Root cause

Metrics recording is triggered only in ChatStream.__next__() when StopIteration is raised:

def __next__(self):
    try:
        chunk = self.__wrapped__.__next__()
    except Exception as e:
        if isinstance(e, StopIteration):
            self._process_complete_response()  # ← Metrics recorded here
        raise

The ChatStream class lacks cleanup mechanisms (e.g., __del__, close(), or context-manager __exit__) to ensure metrics are recorded when the stream object is discarded without being fully consumed.
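
One possible direction, sketched below as a minimal, self-contained example rather than the instrumentation's actual code, is to guard the completion callback behind a once-only flag and also invoke it from close() and __del__, so data is still recorded when the wrapper is closed or garbage-collected early. The names ChatStreamSketch, _on_complete, _response_processed, and _ensure_processed are placeholders for illustration:

class ChatStreamSketch:
    def __init__(self, wrapped, on_complete):
        self.__wrapped__ = wrapped          # underlying openai stream
        self._on_complete = on_complete     # stands in for _process_complete_response
        self._response_processed = False

    def _ensure_processed(self):
        # Guard so traces/metrics are recorded exactly once.
        if not self._response_processed:
            self._response_processed = True
            self._on_complete()

    def __iter__(self):
        return self

    def __next__(self):
        try:
            chunk = self.__wrapped__.__next__()
        except StopIteration:
            self._ensure_processed()
            raise
        return chunk

    def close(self):
        # Record data if user code closes the stream before exhausting it.
        self._ensure_processed()
        close = getattr(self.__wrapped__, "close", None)
        if close:
            close()

    def __del__(self):
        # Last resort: record data when the wrapper is garbage-collected
        # without being consumed or closed.
        self._ensure_processed()

Relying on __del__ alone would be best-effort, since garbage-collection timing isn't guaranteed; an explicit close()/context-manager path would be the primary mechanism, with __del__ as a fallback.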

👟 Reproduction steps

from traceloop.sdk import Traceloop
from openai import OpenAI

Traceloop.init(app_name="test", should_enrich_metrics=True)
client = OpenAI()

# Case 1: Stream not consumed - NO METRICS recorded
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True
)
# stream object created but not iterated - metrics lost

# Case 2: Stream consumed - metrics recorded correctly  
stream = client.chat.completions.create(
    model="gpt-3.5-turbo", 
    messages=[{"role": "user", "content": "Hello"}],
    stream=True
)
for chunk in stream:  # Full consumption triggers metrics
    pass

Case 1 produces zero metrics and traces. Case 2 produces both traces and metrics.

👍 Expected behavior

Traces and metrics should be recorded regardless of whether the streaming response is consumed by user code.
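
For example, both of the following patterns should also produce a span and metrics once this is fixed (hedged examples reusing the client from the reproduction steps; whether close() is forwarded to the underlying openai Stream by the instrumentation wrapper is an assumption):

# Partial consumption, then the stream is discarded
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)
first_chunk = next(stream)  # read a single chunk
del stream                  # dropped before exhaustion - data should still be recorded

# Explicit close without consuming
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)
stream.close()  # assumes close() reaches the underlying openai Stream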

👎 Actual Behavior with Screenshots

[Screenshot: metrics view showing no data]

No metrics are recorded here.

🤖 Python Version

Python 3.12.1

📃 Provide any additional context for the Bug.

No response

👀 Have you spent some time to check if this bug has been raised before?

  • I checked and didn't find a similar issue

Are you willing to submit PR?

Yes I am willing to submit a PR!
