Back-end & Supported Models List #18
- Ollama
Please comment below with the models you've tested and confirmed to work.
Working models:
(Only tool-calling models are supported at the moment.)
Non-working (yet, hopefully) models:
**Models that do not support tool calling do not work with nanocoder yet; non-tool-calling support is in progress in #21.**
```
Please wait, AI is thinking...
file:///home/nimish/.nvm/versions/node/v22.18.0/lib/node_modules/@motesoftware/nanocoder/node_modules/ollama/dist/browser.mjs:73
throw new ResponseError(message, response.status);
^
ResponseError: registry.ollama.ai/library/qwen3-coder:latest does not support tools
at checkOk (file:///home/nimish/.nvm/versions/node/v22.18.0/lib/node_modules/@motesoftware/nanocoder/node_modules/ollama/dist/browser.mjs:73:9)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async post (file:///home/nimish/.nvm/versions/node/v22.18.0/lib/node_modules/@motesoftware/nanocoder/node_modules/ollama/dist/browser.mjs:137:3)
at async Ollama.processStreamableRequest (file:///home/nimish/.nvm/versions/node/v22.18.0/lib/node_modules/@motesoftware/nanocoder/node_modules/ollama/dist/browser.mjs:256:25)
at async OllamaClient.chatStream (file:///home/nimish/.nvm/versions/node/v22.18.0/lib/node_modules/@motesoftware/nanocoder/dist/ollama-client.js:60:24) {
error: 'registry.ollama.ai/library/qwen3-coder:latest does not support tools',
status_code: 400
}
```
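
If you want to check in advance whether a pulled model will hit this error, here is a minimal sketch using the same `ollama` npm client that appears in the trace above. The `supportsTools` helper and its dummy `noop` tool are hypothetical names for illustration, not part of nanocoder:

```typescript
// Minimal sketch, not part of nanocoder: probe whether a locally pulled
// Ollama model supports tool calling, using the `ollama` npm client.
import { Ollama } from 'ollama';

const ollama = new Ollama({ host: 'http://localhost:11434' });

async function supportsTools(model: string): Promise<boolean> {
  try {
    // Advertise one do-nothing tool; models without tool support make the
    // server reply 400 ("... does not support tools"), which the client
    // surfaces as a ResponseError like the one in the log above.
    await ollama.chat({
      model,
      messages: [{ role: 'user', content: 'ping' }],
      tools: [
        {
          type: 'function',
          function: {
            name: 'noop',
            description: 'Does nothing; exists only to probe tool support.',
            parameters: { type: 'object', required: [], properties: {} },
          },
        },
      ],
    });
    return true;
  } catch (err: any) {
    if (typeof err?.error === 'string' && err.error.includes('does not support tools')) {
      return false;
    }
    throw err; // other failures: model not pulled, server down, etc.
  }
}

// Example: prints false for the model from the log above.
supportsTools('qwen3-coder:latest').then((ok) => console.log(ok));
```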
Please comment below to report the models that did or didn't work for you; we'd like to maintain a list of recommended models for the community.
Cheers,
Nimish