[Feature]: support service_tier for o3 and o4-mini #10307

Open
alexzeitgeist opened this issue Apr 25, 2025 · 1 comment
Labels
enhancement New feature or request

The Feature

Hi,

Supporting the service_tier parameter ("flex") can cut costs roughly in half when using o3 or o4-mini.

See: https://platform.openai.com/docs/guides/flex-processing
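For reference, here is a minimal sketch of the underlying OpenAI call this request asks to support, assuming the official openai Python SDK; the flex guide also recommends raising the request timeout, since flex requests can queue longer:

```python
# Sketch only: direct OpenAI call with the flex service tier.
# Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="o3",
    messages=[{"role": "user", "content": "Summarize flex processing."}],
    service_tier="flex",  # flex tier: ~50% lower cost, but slower / may queue
    timeout=900.0,        # flex requests can take longer; raise the per-request timeout
)
print(response.choices[0].message.content)
```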

Motivation, pitch

It cuts costs roughly in half, which is always nice :)

Are you a ML Ops Team?

No

Twitter / LinkedIn details

No response

alexzeitgeist added the enhancement (New feature or request) label on Apr 25, 2025

ZeroClover commented Apr 25, 2025

According to the documentation, any parameters not on this list are considered provider-specific and are passed directly to the LLM API.

So you can use this parameter directly. I have tried it, and it works fine.
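A minimal sketch of the pass-through described above, assuming the litellm Python SDK: since service_tier is not among litellm's named parameters, it should be forwarded to the OpenAI API unchanged (the explicit "openai/" model prefix here is illustrative).

```python
# Sketch only: passing service_tier through litellm as a
# provider-specific parameter, per the comment above.
import litellm

response = litellm.completion(
    model="openai/o4-mini",
    messages=[{"role": "user", "content": "Hello from the flex tier."}],
    service_tier="flex",  # not a litellm param, so forwarded to OpenAI as-is
)
print(response.choices[0].message.content)
```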
