Support for AI models for Edit Predictions #30255
Replies: 1 comment
-
+1 from me. This was one of the first things I searched for in the Edit Predictions documentation. What I found was that you can choose between Copilot, Supermaven, and the Zeta model. What I would love is to be able to use some locally hosted models alongside the hosted models from Anthropic, OpenAI, etc., for example the new model from JetBrains, Mellum, which seems lightweight enough to run alongside in Ollama: https://huggingface.co/JetBrains/Mellum-4b-base

I thought about something like:

```json
"features": {
  "edit_prediction_provider": {
    "type": "ollama",
    "model": "lazarevtill/Mellum-4b-base"
  }
}
```

Where the default could be:

```json
"features": {
  "edit_prediction_provider": "zed"
}
```

And then you could do the same for LM Studio:

```json
"features": {
  "edit_prediction_provider": {
    "type": "LM Studio",
    "model": "Model name"
  }
}
```

In the Agent Panel you can already connect to Ollama etc., and I found something related to changing the model for the inline assist inside the agent settings:

```json
"agent": {
  "version": "2",
  "inline_assistant_model": null,
  "default_model": null
}
```

This does not have any available autocompletions on my end, but I think I could and should be able to provide the model names here.
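For context on what such a provider would have to do under the hood, here is a minimal sketch of requesting a single completion from a locally running Ollama server via its `/api/generate` HTTP endpoint. The fill-in-the-middle token names (`<fim_prefix>` etc.) are an assumption for illustration; the exact prompt format Mellum expects may differ.

```python
# Sketch: asking a local Ollama server for a code completion.
# Assumption: generic FIM-style prompt tokens; Mellum's real format may differ.
import json
import urllib.request


def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt (token names are assumptions)."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"


def predict_edit(prefix: str, suffix: str,
                 model: str = "lazarevtill/Mellum-4b-base",
                 host: str = "http://localhost:11434") -> str:
    """Request one non-streamed completion from Ollama's /api/generate."""
    payload = json.dumps({
        "model": model,
        "prompt": build_fim_prompt(prefix, suffix),
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

An editor-side provider would call something like `predict_edit` with the text before and after the cursor and surface the returned string as a ghost-text suggestion.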
-
I would like to request support for real-time edit predictions, code completions, and suggestions, like Zeta provides, for other language models such as Gemini and OpenAI.