Allow using OpenAI Pro and Plus plans for an API provider #6990
tholum
started this conversation in
Feature Requests
Replies: 2 comments
As far as I understand, this is against the TOS of OpenAI and Codex. This is similar to Gemini CLI.
Yup! Waiting for this feature too, it's very much needed.
Codex is open source, and it supports using your Pro and Plus plans as the API provider. This should help keep API pricing down, especially now that ChatGPT models are usable for coding.
Current plan (yes, AI-generated)
Proposal: Add “Sign in with ChatGPT (Plus/Pro)” authentication for the OpenAI provider
This document outlines a compliant, testable plan to let users authenticate Roo Code with their ChatGPT Plus/Pro account instead of manually pasting an API key. It follows project policies in
`CONTRIBUTING.md`, `CODE_OF_CONDUCT.md`, and `SECURITY.md`, and integrates cleanly with the existing OpenAI provider implementation described in `README.md`.

Why
Scope (MVP)
- Add an auth mode for the OpenAI provider: `"apiKey" | "chatgpt"`.
- Run a local OAuth callback at `http://127.0.0.1:<port>/auth/callback`, exchange the code for tokens, then perform token-exchange to retrieve an "openai-api-key".
- Store tokens in `SecretStorage`; wire the OpenAI provider to read from SecretStorage when `authMode === "chatgpt"`.
- Out of scope (future): enterprise SSO variants; multi-account switching UI.
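A minimal sketch of the provider-side wiring in scope above, assuming hypothetical names (`SecretReader` stands in for `vscode.SecretStorage`, and `resolveApiKey` is illustrative, not an existing function):

```typescript
// Sketch only: choose the request credential based on the configured auth mode.
// SecretReader is a stand-in interface for vscode.SecretStorage.
interface SecretReader {
  get(key: string): Promise<string | undefined>;
}

type AuthMode = "apiKey" | "chatgpt";

async function resolveApiKey(
  mode: AuthMode,
  configuredKey: string | undefined,
  secrets: SecretReader,
): Promise<string> {
  if (mode === "chatgpt") {
    // In "chatgpt" mode the key is the exchanged credential kept in SecretStorage.
    const key = await secrets.get("roo.openai.chatgpt.apiKey");
    if (!key) throw new Error("Not signed in with ChatGPT; run the sign-in command first.");
    return key;
  }
  // Default "apiKey" mode keeps today's behavior: use the user-pasted key.
  if (!configuredKey) throw new Error("No OpenAI API key configured.");
  return configuredKey;
}
```

The request path itself stays unchanged; only the source of the bearer key differs between modes.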
High-level design
UX additions
- `Roo: Sign in with ChatGPT (OpenAI)`
- `Roo: Sign out ChatGPT (OpenAI)`
- `Roo: Import OpenAI credentials from Codex CLI`
- `Roo: Paste Codex auth.json`

Data model (SecretStorage keys)
- `roo.openai.chatgpt.apiKey`: exchanged OpenAI API key (Bearer token)
- `roo.openai.chatgpt.idToken`: OAuth ID token (JWT)
- `roo.openai.chatgpt.refreshToken`: OAuth refresh token
- `roo.openai.chatgpt.lastRefreshIso`: ISO timestamp

Configuration
- `openAi.authMode`: `"apiKey" | "chatgpt"` (default remains `"apiKey"`)

Request path
- Keep the existing `src/api/providers/openai*.ts` request logic. When `authMode === "chatgpt"`, we read the API key from SecretStorage and pass it as usual.

API endpoints used (explicit)
- Responses API (e.g. `codex-mini-latest`): `POST https://api.openai.com/v1/responses` with `Authorization: Bearer <apiKey>`, `Content-Type: application/json`, and `Accept: text/event-stream` (for streaming).
- Chat Completions: `POST https://api.openai.com/v1/chat/completions` with `Authorization: Bearer <apiKey>` and `Content-Type: application/json` (SDK handles streaming headers).
- `openAiNativeBaseUrl` (default `https://api.openai.com`) for the `openai-native` handler.
- `openAiBaseUrl` (default `https://api.openai.com/v1`) for the `openai` handler.
- Optional headers: `OpenAI-Organization`, `OpenAI-Project`.
- We do not call `https://chatgpt.com/backend-api/codex`; the "drop-in" behavior is limited to the authentication flow and token-exchange semantics.

OAuth + token-exchange flow (mirrors Codex CLI)
Start local server
- Listen on `127.0.0.1`, port 1455 by default; if occupied, pick a random free port.
- Generate `state` and PKCE `code_verifier`/`code_challenge` (S256).

Open browser to authorization URL
- `https://auth.openai.com/oauth/authorize` with:
  - `response_type=code`
  - `client_id=app_EMoamEEZ73f0CkXaXp7hrann`
  - `redirect_uri=http://localhost:1455/auth/callback`
  - `scope=openid profile email offline_access`
  - `code_challenge` + `code_challenge_method=S256`
  - `id_token_add_organizations=true`
  - `codex_cli_simplified_flow=true`
  - `state`

Handle callback on
`/auth/callback`

- Validate `state`.
- `POST https://auth.openai.com/oauth/token` with `grant_type=authorization_code`, `code`, `redirect_uri`, `client_id`, `code_verifier`.
- Store `id_token`, `access_token`, `refresh_token` in SecretStorage.

Token exchange → API key
- If the `id_token` claims include organization/project (or personal use is allowed), request an API key via token-exchange:
  - `grant_type=urn:ietf:params:oauth:grant-type:token-exchange`
  - `requested_token=openai-api-key`
  - `subject_token=<id_token>`
  - `subject_token_type=urn:ietf:params:oauth:token-type:id_token`
  - `client_id=<RooClientId>`
- Store the result as `roo.openai.chatgpt.apiKey` in SecretStorage.

Optional: complimentary credit redemption (best-effort)
- Call `https://api.openai.com/v1/billing/redeem_credits` with the `id_token` when the account is Plus/Pro and eligible. Errors are logged as warnings only.

Finish
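Taken together, the client-side pieces of the flow above (PKCE values, the authorization URL, and the token-exchange body) can be sketched as follows. This is illustrative only; `<RooClientId>` remains a placeholder, and Node's `crypto` module stands in for whatever the extension actually uses:

```typescript
import { createHash, randomBytes } from "node:crypto";

// PKCE per RFC 7636: code_challenge = BASE64URL(SHA256(code_verifier)), method S256.
function generatePkce(): { verifier: string; challenge: string } {
  const verifier = randomBytes(32).toString("base64url"); // 43-char URL-safe string
  const challenge = createHash("sha256").update(verifier).digest("base64url");
  return { verifier, challenge };
}

// Authorization URL with the exact parameters listed in the flow above.
function buildAuthorizeUrl(state: string, challenge: string): string {
  const params = new URLSearchParams({
    response_type: "code",
    client_id: "app_EMoamEEZ73f0CkXaXp7hrann",
    redirect_uri: "http://localhost:1455/auth/callback",
    scope: "openid profile email offline_access",
    code_challenge: challenge,
    code_challenge_method: "S256",
    id_token_add_organizations: "true",
    codex_cli_simplified_flow: "true",
    state,
  });
  return `https://auth.openai.com/oauth/authorize?${params.toString()}`;
}

// Form body for the token-exchange request; clientId is the <RooClientId> placeholder.
function buildTokenExchangeBody(idToken: string, clientId: string): URLSearchParams {
  return new URLSearchParams({
    grant_type: "urn:ietf:params:oauth:grant-type:token-exchange",
    requested_token: "openai-api-key",
    subject_token: idToken,
    subject_token_type: "urn:ietf:params:oauth:token-type:id_token",
    client_id: clientId,
  });
}
```

These are pure helpers, so they can be unit-tested without any network access.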
Refresh policy
- Check whether the `id_token` is expiring (or older than ~28 days). If so, refresh via `POST /oauth/token` with `grant_type=refresh_token`, rotate tokens, and optionally re-run token-exchange to rotate the API key if required. Update `lastRefreshIso`.

Headless/remote support
- Forward the callback port with `ssh -L 1455:localhost:1455 <host>`, or copy/paste the printed URL into a local browser.

Implementation plan (issue-first, small PRs)
Provider wiring (small PR)
- Add the `openAi.authMode` config and read `SecretStorage` for `chatgpt` mode.
- No behavior change in `apiKey` mode.

UI actions (small PR)
OAuth helper (medium PR)
- Local callback server and browser launch via `env.openExternal`.

Token-exchange + storage (medium PR)
- Exchange the `id_token` for an API key and store it as `roo.openai.chatgpt.apiKey`.

Refresh + best-effort credit redemption (small PR)
Import from Codex CLI (small PR)
- If `~/.codex/auth.json` exists, parse and import `OPENAI_API_KEY` and tokens to SecretStorage (user confirmation required).
- Alternatively, let the user paste the contents of `auth.json`. Parse in-memory and discard the raw text after success.
- Require either `OPENAI_API_KEY` or `tokens.access_token`, and a `tokens.id_token`.
- Mapping:
  - `OPENAI_API_KEY` → `roo.openai.chatgpt.apiKey` (if present)
  - `tokens.id_token` → `roo.openai.chatgpt.idToken`
  - `tokens.refresh_token` → `roo.openai.chatgpt.refreshToken`
  - `last_refresh` → `roo.openai.chatgpt.lastRefreshIso` (if present)
- Set `openAi.authMode` to `"chatgpt"` and update status.

Tests & docs (small PR)
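As one concrete example of what the tests could cover, the Codex CLI import mapping above is a pure function and easy to exercise directly. The `auth.json` field names here follow the mapping list; treat the exact file shape as an assumption:

```typescript
// Sketch only: map a parsed Codex CLI auth.json to the SecretStorage keys above.
interface CodexAuthJson {
  OPENAI_API_KEY?: string;
  tokens?: { access_token?: string; id_token?: string; refresh_token?: string };
  last_refresh?: string;
}

function mapCodexAuth(auth: CodexAuthJson): Record<string, string> {
  const tokens = auth.tokens ?? {};
  // Require either an API key or an access token, plus an id_token.
  if (!auth.OPENAI_API_KEY && !tokens.access_token) {
    throw new Error("auth.json has neither OPENAI_API_KEY nor tokens.access_token");
  }
  if (!tokens.id_token) {
    throw new Error("auth.json is missing tokens.id_token");
  }
  const out: Record<string, string> = {
    "roo.openai.chatgpt.idToken": tokens.id_token,
  };
  if (auth.OPENAI_API_KEY) out["roo.openai.chatgpt.apiKey"] = auth.OPENAI_API_KEY;
  if (tokens.refresh_token) out["roo.openai.chatgpt.refreshToken"] = tokens.refresh_token;
  if (auth.last_refresh) out["roo.openai.chatgpt.lastRefreshIso"] = auth.last_refresh;
  return out;
}
```

Keeping the mapping separate from file I/O makes the "parse in-memory and discard" requirement straightforward to verify.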
Rollout
Security & privacy
- Store secrets in `vscode.SecretStorage`; never log raw tokens or keys.
- Validate `state`; use PKCE S256.
- Follow `SECURITY.md` for responsible handling and disclosure.

Community/process compliance
- Follow `CONTRIBUTING.md`.
- Follow `CODE_OF_CONDUCT.md`.

Compatibility constraints (drop-in replacement for Codex CLI)
- Use the same OAuth client ID: `app_EMoamEEZ73f0CkXaXp7hrann`.
- Use port `1455` and the redirect URI `http://localhost:1455/auth/callback`.
- Send `id_token_add_organizations=true` and `codex_cli_simplified_flow=true`.
- Request `openai-api-key` via token-exchange.

Risks and mitigations