Jarvis is an LLM-powered developer plugin for the JetBrains IDE platform. It supports developers exclusively through local LLMs, integrating with Ollama to run them. To reduce loading times, Jarvis keeps the currently used model in memory for five minutes.
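The five-minute retention corresponds to Ollama's `keep_alive` request parameter. A minimal sketch of what such a request could look like (the endpoint and field names are Ollama's HTTP API; the exact request Jarvis builds is an assumption):

```python
import json

# Sketch of a chat request against Ollama's HTTP API.
# keep_alive="5m" asks Ollama to keep the model loaded for five minutes
# after the request, which is what makes follow-up answers fast.
def build_chat_request(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "keep_alive": "5m",  # retain the model in memory for 5 minutes
        "stream": False,
    }

payload = build_chat_request("qwen3:1.7b", "Explain this function.")
body = json.dumps(payload)  # POST this to http://localhost:11434/api/chat
```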
- Install and run Ollama
- Install the Jarvis plugin in your JetBrains IDE:
  - Using the IDE's built-in plugin system: Settings/Preferences > Plugins > Marketplace > Search for "jarvis" > Install
  - Manually: Download the latest release and install it using Settings/Preferences > Plugins > ⚙️ > Install plugin from disk...
Jarvis is controlled via chat messages and commands. To start a conversation, simply type `/new` in the chat window.
Available commands:

- `/help` or `/?` - Shows this help message
- `/new` - Starts a new conversation
- `/plain` - Sends a chat message without code context
- `/copy` - Copies the conversation to the clipboard
- `/model <modelName>` - Changes the model to use (the default model name is `qwen3:1.7b`)
- `/model` or `/model-info` - Shows the info card of the current model
- `/model set -<parameter> <value>` - Configures model inference parameters
- `/host <host>` - Sets the Ollama host (the default is `http://localhost:11434`)
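As an illustration, `/model set` style parameters map naturally onto the `options` object Ollama accepts on chat requests. The parameter names below (`temperature`, `num_ctx`) are standard Ollama options; how Jarvis actually parses the command is an assumption:

```python
# Hypothetical parsing of a "/model set -temperature 0.2" style command
# into the "options" object Ollama accepts on chat/generate requests.
def parse_model_set(args: str) -> dict:
    tokens = args.split()
    options = {}
    # Expect alternating "-<parameter> <value>" pairs.
    for flag, value in zip(tokens[::2], tokens[1::2]):
        name = flag.lstrip("-")
        try:
            parsed = int(value)
        except ValueError:
            try:
                parsed = float(value)
            except ValueError:
                parsed = value  # keep non-numeric values as strings
        options[name] = parsed
    return options

opts = parse_model_set("-temperature 0.2 -num_ctx 8192")
```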
When using reasoning models with Ollama, Jarvis shows their internal thoughts in an expandable section at the top of each answer.
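Reasoning models commonly wrap their deliberation in `<think>…</think>` tags in the raw reply. A minimal sketch of separating thoughts from the answer, assuming that tag convention (not necessarily Jarvis's exact parsing):

```python
import re

# Split a reasoning model's raw reply into its internal thoughts and the
# final answer, assuming the common <think>...</think> tag convention.
def split_thoughts(reply: str) -> tuple[str, str]:
    match = re.search(r"<think>(.*?)</think>", reply, flags=re.DOTALL)
    if match is None:
        return "", reply.strip()
    thoughts = match.group(1).strip()
    answer = (reply[:match.start()] + reply[match.end():]).strip()
    return thoughts, answer

thoughts, answer = split_thoughts(
    "<think>Check the loop bounds.</think>Use range(n)."
)
```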
This project is licensed under the MIT License.