
Support OpenAI models (API) directly #27

@codefromthecrypt

Description


Please support OpenAI models (the OpenAI API) directly, as that opens up many options, including Ollama, in a way that is compatible with OpenTelemetry. LiteLLM's telemetry support is callback based, so it requires manual setup. If you used the OpenAI SDK or made direct HTTP calls, we could get better traces than we do today.

Right now, you can carefully re-route configuration through LiteLLM, but that requires extra dependencies and setup:

# LiteLLM uses different ENV variables for OpenAI and OpenTelemetry fields.
import os
import litellm
from litellm.integrations.opentelemetry import OpenTelemetry, OpenTelemetryConfig

os.environ["OPENAI_API_BASE"] = os.getenv("OPENAI_BASE_URL")
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT") + "/v1/traces"
otel_config = OpenTelemetryConfig(exporter="otlp_http", endpoint=otlp_endpoint)
litellm.callbacks = [OpenTelemetry(otel_config)]

...

agent = Agent(name=app_name, model=LiteLlm(model="openai/" + model), ...)

If the OpenAI model support used the plain OpenAI SDK, the standard OpenAI instrumentation from OpenTelemetry would apply with no programmatic setup.
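
For comparison, here is a minimal sketch of what that could look like, assuming the opentelemetry-instrumentation-openai-v2 contrib package; the OpenAIInstrumentor call and the model name below are illustrative, not part of this project:

# A sketch, not project code: instrument the stock OpenAI SDK with the
# OpenTelemetry contrib package opentelemetry-instrumentation-openai-v2.
# Running under the `opentelemetry-instrument` launcher would make even
# this call unnecessary (zero-code setup).
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

OpenAIInstrumentor().instrument()

# The OpenAI SDK already reads OPENAI_BASE_URL and OPENAI_API_KEY from the
# environment, so pointing it at an Ollama endpoint needs no re-mapping.
from openai import OpenAI

client = OpenAI()
client.chat.completions.create(
    model="qwen3",  # illustrative model name served by Ollama
    messages=[{"role": "user", "content": "hello"}],
)

Run under `opentelemetry-instrument`, the exporter is configured from OTEL_EXPORTER_OTLP_ENDPOINT automatically, with no callback wiring.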

Labels: models (Issues about model support)