otel: fixes dep on grpc when not using it and honors opentelemetry-instrument #9972
Conversation
otel: fixes dep on grpc when not using it and honors opentelemetry-instrument

Signed-off-by: Adrian Cole <adrian.cole@elastic.co>
Note: I ran out of time today and didn't test this in a 3rd-party project to verify everything works outside unit and mock tests. If no changes are requested by tomorrow (Singapore time) and no one else has tried this (e.g. by changing their requirement to a git ref of my branch), I'll verify it manually myself.
OK I verified manually, and this works like a champ @ishaan-jaff. Can you review for merge? Here's my test script:

```python
import os

import litellm
from litellm import completion
from litellm.integrations.opentelemetry import OpenTelemetry

# LiteLLM uses old ENV until https://github.com/BerriAI/litellm/issues/7829
os.environ["OPENAI_API_BASE"] = os.getenv("OPENAI_BASE_URL")

litellm.callbacks = [OpenTelemetry()]


def main():
    messages = [
        {
            "role": "user",
            "content": "Answer in up to 3 words: Which ocean contains Bouvet Island?",
        },
    ]
    model = os.getenv("CHAT_MODEL", "gpt-4o-mini")
    chat_completion = completion(model="openai/" + model, messages=messages)
    print(chat_completion["choices"][0]["message"]["content"])


if __name__ == "__main__":
    main()
```

I ran it with `dotenv run -- opentelemetry-instrument python main.py` (see the README diff below).
In otel-tui I saw this trace: [trace screenshot]

And here's my whole project diff, noting I don't need to explicitly include the otel grpc exporter anymore:

```diff
diff --git a/Dockerfile b/Dockerfile
index 254ced4..3218da5 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -1,10 +1,16 @@
 # Use glibc-based image with pre-compiled wheels for grpcio and tiktoken
 FROM python:3.12-slim
 
+# TODO: temporary until https://github.com/BerriAI/litellm/pull/9972
+RUN apt-get update \
+    && apt-get install -y --no-install-recommends git \
+    && apt-get clean \
+    && rm -rf /var/lib/apt/lists/*
+
 RUN python -m pip install --upgrade pip
 
 COPY /requirements.txt /tmp/requirements.txt
 RUN pip install -r /tmp/requirements.txt
 
 COPY main.py /
-CMD [ "python", "main.py" ]
+CMD [ "opentelemetry-instrument", "python", "main.py" ]
diff --git a/README.md b/README.md
index 13ce749..b3f6594 100644
--- a/README.md
+++ b/README.md
@@ -20,7 +20,7 @@ python3 -m venv .venv
 source .venv/bin/activate
 pip install "python-dotenv[cli]"
 pip install -r requirements.txt
-dotenv run -- python main.py
+dotenv run -- opentelemetry-instrument python main.py
 
 ## Notes
diff --git a/main.py b/main.py
index 358fafb..c38bed8 100644
--- a/main.py
+++ b/main.py
@@ -2,14 +2,12 @@ import os
 
 import litellm
 from litellm import completion
-from litellm.integrations.opentelemetry import OpenTelemetry, OpenTelemetryConfig
+from litellm.integrations.opentelemetry import OpenTelemetry
 
 # LiteLLM uses old ENV until https://github.com/BerriAI/litellm/issues/7829
 os.environ["OPENAI_API_BASE"] = os.getenv("OPENAI_BASE_URL")
 
-# LiteLLM uses custom ENV until https://github.com/BerriAI/litellm/issues/9901
-otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT") + "/v1/traces"
-otel_config = OpenTelemetryConfig(exporter="otlp_http", endpoint=otlp_endpoint)
-litellm.callbacks = [OpenTelemetry(otel_config)]
+
+litellm.callbacks = [OpenTelemetry()]
 
 def main():
diff --git a/requirements.txt b/requirements.txt
index 5d3b9da..4a48ff5 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,8 +1,7 @@
 # current versions are missing packages if you only use 'litellm'
-litellm[proxy]~=1.65.6
+# litellm[proxy]~=1.65.6
+litellm[proxy] @ git+https://github.com/codefromthecrypt/litellm.git@otel-no-override
 opentelemetry-sdk~=1.32.0
 opentelemetry-exporter-otlp-proto-http~=1.32.0
-# dependency even though it's not used
-opentelemetry-exporter-otlp-proto-grpc~=1.32.0
 opentelemetry-distro~=0.52b0
```
@ishaan-jaff I'm trying to nag my way through your 100-issue backlog... is it working? ;)

@CLAassistant just signed in another PR!

@ishaan-jaff you keen on this one?
```diff
@@ -115,6 +115,16 @@ def __init__(
         otel_exporter_logger = logging.getLogger("opentelemetry.sdk.trace.export")
         otel_exporter_logger.setLevel(logging.DEBUG)
 
+        # Don't override a tracer provider already set by the user
+        if trace.get_tracer_provider() is None:
```

This makes sure the tracer provider supplied by opentelemetry-instrument is in use.
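The guard, as a standalone sketch (assuming the standard OTel API and SDK; note the API's default provider is a proxy object rather than None, so this sketch compares types instead of testing for None):

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider

# Install our own provider only when the user (or
# opentelemetry-instrument) hasn't already installed an SDK provider.
if not isinstance(trace.get_tracer_provider(), TracerProvider):
    trace.set_tracer_provider(TracerProvider())
```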
@ishaan-jaff @aabmass this is the first step in tighter otel integration and less impactful than changing the callbacks. This allows LiteLLM to reuse a tracer provider set by opentelemetry-instrument instead of overriding it. Once this is done, we can discuss something a bit more dynamic: having the callbacks run in a way that traces are captured as they happen instead of after-the-fact (now they are recorded after the request). This would be more of a change, but it would allow litellm to be better integrated in applications which have other otel concerns. For example, this would ensure HTTP client spans or MCP spans end up in the right trace. For now, let's get this part in?
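For illustration only, a hypothetical sketch (not part of this PR) of that "as they happen" approach: keeping a span open for the duration of the request means spans created inside it, such as HTTP client spans, nest under it automatically.

```python
from opentelemetry import trace

from litellm import completion

tracer = trace.get_tracer("litellm")


# Hypothetical helper: a live span around the request, so any
# instrumentation that fires inside (HTTP client, MCP) joins this trace.
def traced_completion(**kwargs):
    with tracer.start_as_current_span("litellm.completion"):
        return completion(**kwargs)
```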
Title
This does two things around otel:
- avoids a hard dependency on the otlp grpc exporter when it isn't used
- honors opentelemetry-instrument (auto-configuration)

Auto-configuration means prefixing python with `opentelemetry-instrument` or adding entrypoint code like this, e.g. via EDOT; a sketch of such entrypoint code follows.
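A minimal sketch of such entrypoint code, assuming the standard OTel SDK packages (illustrative only, not EDOT's actual bootstrap):

```python
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Configure the global tracer provider before litellm is used.
# OTLPSpanExporter() reads OTEL_EXPORTER_OTLP_ENDPOINT from the env.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)  # with this PR, LiteLLM reuses it
```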
Relevant issues
Fixes #9901
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR:
- I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement - see details: https://docs.litellm.ai/docs/extras/contributing_code)
- My PR passes all unit tests (`make test-unit`)

Type
🆕 New Feature
Changes
Reorganized code in opentelemetry.py so that LiteLLM ENV variables are read in one place only (the config). If a tracer provider already exists, avoid registering a new one. Also reorganized where specific exporter types are imported, so an exporter is only imported when it is actually selected; see the sketch below.
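A hedged sketch of that import reorganization (illustrative, not LiteLLM's exact code; `_select_span_exporter` is a made-up name): the grpc exporter module is imported only when that protocol is selected, so the grpc package is no longer required for http users.

```python
def _select_span_exporter(exporter: str, endpoint: str):
    # Hypothetical helper. Import lazily so the
    # opentelemetry-exporter-otlp-proto-grpc package is only needed
    # when the grpc protocol is actually chosen.
    if exporter == "otlp_grpc":
        from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
            OTLPSpanExporter,
        )
    else:
        from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
            OTLPSpanExporter,
        )
    return OTLPSpanExporter(endpoint=endpoint)
```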
cc @xrmx @anuraaga