
Support OpenAI models (API) directly #27

Open

codefromthecrypt opened this issue Apr 10, 2025 · 3 comments
Labels: models (Issues about model support)

Comments

codefromthecrypt commented Apr 10, 2025

Please support OpenAI models (API) directly, as that opens up many options, including Ollama, in a way that is compatible with OpenTelemetry. LiteLLM's telemetry support is callback-based, so it requires manual setup. If you used the OpenAI SDK or made direct HTTP calls, we could get better traces than we do today.

Right now, you can carefully re-route the configuration to LiteLLM, but it requires extra dependencies and setup:

# LiteLLM uses different ENV variables for the OpenAI and OpenTelemetry fields,
# so both the API base and the tracing callback have to be wired up by hand.
import os

import litellm
from litellm.integrations.opentelemetry import OpenTelemetry, OpenTelemetryConfig
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

os.environ["OPENAI_API_BASE"] = os.getenv("OPENAI_BASE_URL")
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT") + "/v1/traces"
otel_config = OpenTelemetryConfig(exporter="otlp_http", endpoint=otlp_endpoint)
litellm.callbacks = [OpenTelemetry(otel_config)]

...

agent = Agent(name=app_name, model=LiteLlm(model="openai/" + model), ...

If the OpenAI model support used the regular OpenAI SDK, it could rely on the standard OpenAI instrumentation from OpenTelemetry with no programmatic setup.
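
For comparison, here is a minimal sketch of what that direct path could look like. This assumes the opentelemetry-instrumentation-openai-v2 contrib package (my assumption, not something ADK ships) and an exporter configured via the usual OTEL_* environment variables; the model name and prompt are placeholders:

# Sketch only: assumes `pip install opentelemetry-instrumentation-openai-v2`
# and a tracer provider/exporter configured elsewhere (for example by running
# the app under `opentelemetry-instrument`, which reads the OTEL_* env vars).
from openai import OpenAI
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

# One call patches the OpenAI SDK so every request emits GenAI spans.
OpenAIInstrumentor().instrument()

client = OpenAI()  # honors OPENAI_BASE_URL / OPENAI_API_KEY, so Ollama works too
client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "hello"}],
)

With the zero-code route (running the app under opentelemetry-instrument), even the instrument() call goes away, which is the contrast with the LiteLLM callback wiring above.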

@codefromthecrypt (Author)

cc @aabmass in case interested

tutumomo pushed a commit to tutumomo/adk-python that referenced this issue Apr 26, 2025
* Enhance README for CrewAI Agent with A2A Protocol:

- Added a detailed explanation of functionality, setup instructions, features, and limitations.
- Included a sequence diagram for clarity on agent interactions.
- Improved formatting for better readability.

* Update README for LangGraph Currency Agent:

- Expanded overview of the currency conversion agent and its functionality.
- Added detailed setup instructions and technical implementation details.
- Included key features and limitations of the agent.
- Enhanced formatting and added a sequence diagram for better understanding of interactions.

* Remove instructions for simulated streaming in CrewAI README

* Remove references to streaming

---------

Co-authored-by: kthota-g <kcthota@google.com>

aabmass (Member) commented Apr 30, 2025

Thanks for opening the discussion @codefromthecrypt. I'm wondering if this topic has been brought up with LiteLLM as well. If they had cleaner support for OTel and followed the semantic conventions, is there any benefit of using OpenAI SDK directly?

@codefromthecrypt (Author)

@aabmass I made a comment here about the status quo as I understand it: BerriAI/litellm#9972 (comment)

@boyangsvl added the models label May 3, 2025