LiteLLM / OpenAI Codex Discussion and Support #10156
ishaan-jaff started this conversation in General
Thanks @ishaan-jaff for starting this thread. Here are the steps I followed (after installing Codex via npm) to set up Codex with LiteLLM on Docker (Podman, in my case); a sketch of these steps follows the list.
1. Fetched the LiteLLM image
2. Created a LiteLLM config file (I defined this in …)
3. Assigned API keys in environment variables in …
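A minimal sketch of those three steps, assuming the config file is named litellm_config.yaml in the current directory and that the keys it references are passed through as container environment variables (paths, key values, and the port are placeholders):

```bash
# Step 1: fetch the official LiteLLM image (same commands work with docker)
podman pull ghcr.io/berriai/litellm:main-latest

# Steps 2-3: mount the config file, supply provider keys as env vars,
# and start the proxy on port 4000
podman run \
  -v "$(pwd)/litellm_config.yaml:/app/config.yaml" \
  -e OPENAI_API_KEY="sk-..." \
  -e ANTHROPIC_API_KEY="sk-ant-..." \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml
```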
Original post (ishaan-jaff):
Starting this thread for support questions about using OpenAI Codex with LiteLLM.
Tutorial on using Codex with LiteLLM: https://docs.litellm.ai/docs/tutorials/openai_codex
Steps to Use
1. Install the OpenAI Codex CLI (install command below)
2. Configure model routing in litellm_config.yaml (example config below)
3. Start the LiteLLM proxy (startup command below)
4. Point Codex to the LiteLLM proxy via environment variables: we set the base URL to LiteLLM and the API key to the LiteLLM virtual key (exports below)
5. Run Codex with your preferred model (example below)
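Step 1, installing the Codex CLI globally via npm:

```bash
npm i -g @openai/codex
```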
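Step 2, a minimal litellm_config.yaml sketch. The two models here are just examples; the `os.environ/...` syntax tells LiteLLM to read each key from the proxy's environment:

```yaml
model_list:
  # Alias on the left is what Codex will request; litellm_params route it
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-3-7-sonnet-latest
    litellm_params:
      model: anthropic/claude-3-7-sonnet-latest
      api_key: os.environ/ANTHROPIC_API_KEY
```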
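Step 3, starting the proxy (it listens on port 4000 by default; the config path is a placeholder):

```bash
litellm --config /path/to/litellm_config.yaml
```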
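Step 4, pointing Codex at the proxy. `sk-1234` stands in for your LiteLLM virtual key, and depending on your Codex CLI version the base-URL variable may be OPENAI_BASE_URL instead:

```bash
export OPENAI_API_BASE="http://0.0.0.0:4000"
export OPENAI_API_KEY="sk-1234"   # your LiteLLM virtual key
```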
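Step 5, running Codex against any model alias defined in the config (the prompt is an example):

```bash
codex --model claude-3-7-sonnet-latest "write a hello world script"
```

Because Codex only sees the LiteLLM base URL and virtual key, switching providers is just a matter of passing a different alias from the config's model_list.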