
Getting Codex Working with Ollama #864

Closed · Answered by daveManDaveDude
daveManDaveDude asked this question in Q&A

I have answered my own question. I had gone directly from the default OpenAI model to a local model on my Mac mini, so my expectations were a little off :-) Eventually I managed to get a local model to write a one-line hello world in Python, after a lot of thinking and trying to work out how to write a file.

So my answer to myself is: Codex CLI is fine, you just need a much more powerful model!
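For anyone landing here with the same question: under the hood, pointing Codex CLI at Ollama amounts to giving it an OpenAI-compatible endpoint, which Ollama serves locally. The sketch below is just a quick sanity check that your local model responds at all, made with the same kind of chat-completions request a coding agent would issue; it assumes Ollama is running on its default port (11434), and the model name is a placeholder for whatever you have pulled locally.

```python
# Minimal sanity check against a local Ollama server via its
# OpenAI-compatible endpoint (http://localhost:11434/v1).
# Assumptions: Ollama is running locally; the model name below is a
# placeholder, substitute one you have pulled (e.g. with `ollama pull`).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",                      # any non-empty string works for local use
)

response = client.chat.completions.create(
    model="qwen2.5-coder:14b",  # hypothetical local model name
    messages=[{"role": "user", "content": "Write a one-line hello world in Python."}],
)
print(response.choices[0].message.content)
```

If this prompt already takes a long time or produces a rambling answer, a small local model is unlikely to keep up with agentic use like writing files, which matches my experience above.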

Thanks so much to the entire team and community for Codex and all things open source.
