Open
Labels: models [Component] Issues related to model support
Description
Please support OpenAI models (API) directly, as that opens up many options, including Ollama, in a way that is compatible with OpenTelemetry. LiteLLM's telemetry support is callback-based, so it requires manual setup. If ADK used the OpenAI SDK or made direct HTTP calls, we could get better traces than we do today.
Right now, you can carefully re-route the config through LiteLLM, but it requires extra dependencies and setup:
import os

import litellm
from litellm.integrations.opentelemetry import OpenTelemetry, OpenTelemetryConfig
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

# LiteLLM uses different ENV variables for OpenAI and OpenTelemetry fields.
os.environ["OPENAI_API_BASE"] = os.getenv("OPENAI_BASE_URL")
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT") + "/v1/traces"
otel_config = OpenTelemetryConfig(exporter="otlp_http", endpoint=otlp_endpoint)
litellm.callbacks = [OpenTelemetry(otel_config)]
...
agent = Agent(name=app_name, model=LiteLlm(model="openai/" + model), ...)
If the OpenAI model support used the OpenAI SDK directly, it could be traced by the standard OpenAI instrumentation from OpenTelemetry with no programmatic setup.
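For comparison, a minimal sketch of what zero-config tracing could look like with direct SDK usage. This assumes the `opentelemetry-instrumentation-openai-v2` contrib package is installed; the model name and the local Ollama base URL are illustrative, not part of any current ADK API.

```python
import os

from openai import OpenAI
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

# One call patches the OpenAI client library; spans are then exported via the
# standard OTEL_EXPORTER_OTLP_ENDPOINT env var, with no callback wiring in
# application code.
OpenAIInstrumentor().instrument()

# The same client can also reach Ollama, which exposes an OpenAI-compatible
# endpoint (base URL shown here is the Ollama default and is an assumption).
client = OpenAI(base_url=os.getenv("OPENAI_BASE_URL", "http://localhost:11434/v1"))
resp = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "hello"}],
)
```

Compared with the LiteLLM routing above, there is no environment-variable translation and no manual callback registration; the instrumentation and exporter configuration come entirely from standard OpenTelemetry conventions.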