
otel: fixes dep on grpc when not using it and honors opentelemetry-instrument #9972


Open · codefromthecrypt wants to merge 2 commits into main

Conversation

codefromthecrypt
Contributor

Title

This does two things around otel:

  • moves the import of the http or grpc exporter to where it is used, eliminating a dependency on grpcio when it isn't needed
  • checks for an existing tracer provider before registering one, so that otel auto-configuration remains visible (a sketch of both patterns follows below)

Auto-configuration means prefixing the python command with opentelemetry-instrument, or adding entrypoint code like the following, e.g. via EDOT:

     from opentelemetry.instrumentation import auto_instrumentation
     auto_instrumentation.initialize()
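
For illustration, here is a minimal sketch of both patterns using plain OpenTelemetry SDK calls rather than the exact LiteLLM internals; the function names are made up for this example, and the isinstance check is just one way to detect that an SDK provider has already been installed:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor


def _build_span_exporter(exporter: str, endpoint: str):
    # Import the protocol-specific OTLP exporter lazily, so grpcio is only
    # required when the gRPC exporter is actually selected.
    if exporter == "otlp_grpc":
        from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
    else:
        from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
    return OTLPSpanExporter(endpoint=endpoint)


def _get_tracer(exporter: str, endpoint: str):
    # Respect a tracer provider already registered by opentelemetry-instrument
    # (or auto_instrumentation.initialize()); only install one when no SDK
    # provider has been set yet.
    if not isinstance(trace.get_tracer_provider(), TracerProvider):
        provider = TracerProvider()
        provider.add_span_processor(BatchSpanProcessor(_build_span_exporter(exporter, endpoint)))
        trace.set_tracer_provider(provider)
    return trace.get_tracer(__name__)

With something like that in place, running under opentelemetry-instrument means the provider and exporter configured from the standard OTEL_* env vars are used as-is.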

Relevant issues

Fixes #9901

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement - see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests via make test-unit (https://docs.litellm.ai/docs/extras/contributing_code)
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

[Screenshot: new test passing locally, 2025-04-14 4:51 PM]

Type

🆕 New Feature

Changes

Reorganized code in opentelemetry.py so that LiteLLM ENV variables are read in one place only, in the config. If a tracer provider already exists, avoid registering a new one. Also reorganized where the specific exporter types are imported.
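
As a rough sketch of the config consolidation (the attribute and environment-variable names here are placeholders, not necessarily the exact ones LiteLLM reads):

import os
from dataclasses import dataclass
from typing import Optional


@dataclass
class OTelConfigSketch:
    exporter: str = "otlp_http"
    endpoint: Optional[str] = None

    @classmethod
    def from_env(cls) -> "OTelConfigSketch":
        # All environment reads happen here, once, instead of being scattered
        # across opentelemetry.py.
        return cls(
            exporter=os.getenv("OTEL_EXPORTER", "otlp_http"),
            endpoint=os.getenv("OTEL_ENDPOINT"),
        )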

cc @xrmx @anuraaga


@codefromthecrypt
Contributor Author

Note: I ran out of time today and didn't test this in a 3rd-party project to verify everything works outside unit and mock tests. If no changes are requested by tomorrow (Singapore time) and no one else has tried it (e.g. by pointing their requirement at a git ref of my branch and running with opentelemetry-instrument), I will verify it then.

@codefromthecrypt
Contributor Author

OK, I verified manually, and this works like a champ @ishaan-jaff. Can you review for merge?

Here's my test script

import os

import litellm
from litellm import completion
from litellm.integrations.opentelemetry import OpenTelemetry

# LiteLLM uses old ENV until https://github.com/BerriAI/litellm/issues/7829
os.environ["OPENAI_API_BASE"] = os.getenv("OPENAI_BASE_URL")

litellm.callbacks = [OpenTelemetry()]


def main():
    messages = [
        {
            "role": "user",
            "content": "Answer in up to 3 words: Which ocean contains Bouvet Island?",
        },
    ]
    model = os.getenv("CHAT_MODEL", "gpt-4o-mini")
    chat_completion = completion(model="openai/" + model, messages=messages)
    print(chat_completion["choices"][0]["message"]["content"])


if __name__ == "__main__":
    main()

I ran it with opentelemetry-instrument and the env below:

# Override default ENV variables for Ollama
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_API_KEY=unused
CHAT_MODEL=qwen2.5:0.5b

# OpenTelemetry configuration
OTEL_SERVICE_NAME=python-litellm-openai
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf

In otel-tui I saw this trace:
Screenshot 2025-04-15 at 6 50 22 AM

And here's my whole project diff; note that I no longer need to explicitly include the otel gRPC exporter:

diff --git a/Dockerfile b/Dockerfile
index 254ced4..3218da5 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -1,10 +1,16 @@
 # Use glibc-based image with pre-compiled wheels for grpcio and tiktoken
 FROM python:3.12-slim
 
+# TODO: temporary until https://github.com/BerriAI/litellm/pull/9972
+RUN apt-get update \
+    && apt-get install -y --no-install-recommends git \
+    && apt-get clean \
+    && rm -rf /var/lib/apt/lists/*
+
 RUN python -m pip install --upgrade pip
 COPY /requirements.txt /tmp/requirements.txt
 RUN pip install -r /tmp/requirements.txt
 
 COPY main.py /
-CMD [ "python", "main.py" ]
+CMD [ "opentelemetry-instrument", "python", "main.py" ]
diff --git a/README.md b/README.md
index 13ce749..b3f6594 100644
--- a/README.md
+++ b/README.md
@@ -20,7 +20,7 @@ python3 -m venv .venv
 source .venv/bin/activate
 pip install "python-dotenv[cli]"
 pip install -r requirements.txt
-dotenv run -- python main.py
+dotenv run -- opentelemetry-instrument python main.py
 
 ## Notes
diff --git a/main.py b/main.py
index 358fafb..c38bed8 100644
--- a/main.py
+++ b/main.py
@@ -2,14 +2,12 @@ import os
 
 import litellm
 from litellm import completion
-from litellm.integrations.opentelemetry import OpenTelemetry, OpenTelemetryConfig
+from litellm.integrations.opentelemetry import OpenTelemetry
 
 # LiteLLM uses old ENV until https://github.com/BerriAI/litellm/issues/7829
 os.environ["OPENAI_API_BASE"] = os.getenv("OPENAI_BASE_URL")
-# LiteLLM uses custom ENV until https://github.com/BerriAI/litellm/issues/9901
-otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT") + "/v1/traces"
-otel_config = OpenTelemetryConfig(exporter="otlp_http", endpoint=otlp_endpoint)
-litellm.callbacks = [OpenTelemetry(otel_config)]
+
+litellm.callbacks = [OpenTelemetry()]
 
 
 def main():
diff --git a/requirements.txt b/requirements.txt
index 5d3b9da..4a48ff5 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,8 +1,7 @@
 # current versions are missing packages if you only use 'litellm'
-litellm[proxy]~=1.65.6
+# litellm[proxy]~=1.65.6
+litellm[proxy] @ git+https://github.com/codefromthecrypt/litellm.git@otel-no-override
 
 opentelemetry-sdk~=1.32.0
 opentelemetry-exporter-otlp-proto-http~=1.32.0
-# dependency even though it's not used
-opentelemetry-exporter-otlp-proto-grpc~=1.32.0
 opentelemetry-distro~=0.52b0

@codefromthecrypt
Contributor Author

@ishaan-jaff I'm trying to nag my way through your 100-issue backlog... is it working? ;)

@CLAassistant

CLAassistant commented Apr 22, 2025

CLA assistant check
All committers have signed the CLA.

@codefromthecrypt
Contributor Author

@CLAassistant just signed in another PR!

@codefromthecrypt
Contributor Author

@ishaan-jaff you keen on this one?

@@ -115,6 +115,16 @@ def __init__(
otel_exporter_logger = logging.getLogger("opentelemetry.sdk.trace.export")
otel_exporter_logger.setLevel(logging.DEBUG)

# Don't override a tracer provider already set by the user
if trace.get_tracer_provider() is None:
Contributor Author

This makes sure the tracer provider supplied by opentelemetry-instrument is the one in use.
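
A small hedged illustration of the effect: under opentelemetry-instrument the global provider is configured from the OTEL_* env vars before main.py runs, so a check like the one above leaves it in place and LiteLLM's tracer resolves against it:

from opentelemetry import trace

# With "opentelemetry-instrument python main.py", the distro has already set
# the global tracer provider, so this tracer exports through that provider
# instead of a LiteLLM-created one.
tracer = trace.get_tracer("litellm")
with tracer.start_as_current_span("litellm_request"):
    pass  # spans land in the same trace as other auto-instrumented spans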

@codefromthecrypt
Contributor Author

@ishaan-jaff @aabmass this is the first step in tighter otel integration and less impactful than changing the callbacks. This allows opentelemetry-instrument to configure the tracer used in logging callbacks.

Once this is done, we can discuss something a bit more dynamic: having the callbacks run so that traces are captured as they happen instead of after the fact (right now they are recorded after the request). That would be a bigger change, but it would let litellm integrate better into applications that have other otel concerns, for example ensuring HTTP client spans or MCP spans end up in the right trace.

For now, let's get this part in?

Successfully merging this pull request may close these issues.

[Feature]: integrate with default opentelemetry exporter configured by opentelemetry-instrument