
Jarvis

About

Jarvis is an LLM-powered developer plugin for the JetBrains IDE platform. It supports developers by relying exclusively on local LLMs, which it accesses through Ollama. Jarvis keeps the currently used model in memory for five minutes to reduce loading times.
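
Ollama exposes this keep-loaded behavior through the keep_alive parameter of its HTTP API. The following is a minimal sketch of such a request, assuming the default model and host mentioned in this README; it illustrates the mechanism and is not the plugin's actual code.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Sketch: ask Ollama for a completion and request that the model stay
    // loaded in memory for five minutes via keep_alive = "5m".
    val body = """
        {"model": "qwen3:1.7b",
         "prompt": "Say hello in one word.",
         "stream": false,
         "keep_alive": "5m"}
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/api/generate"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body())
}
```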

Installation

  1. Install and run Ollama (an optional reachability check follows this list)
  2. Install Jarvis plugin in your JetBrains IDE:
    • Using the IDE built-in plugin system: Settings/Preferences > Plugins > Marketplace > Search for "jarvis" > Install
    • Manually: Download the latest release and install it manually using Settings/Preferences > Plugins > ⚙️ > Install plugin from disk...
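
Before installing the plugin, you can check that Ollama is reachable. The sketch below assumes the default host http://localhost:11434 and simply prints whatever the server answers on its root URL when it is running.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Optional check: print the response from the default Ollama host.
    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/"))
        .GET()
        .build()
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println("${response.statusCode()} ${response.body()}")
}
```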

Usage

Jarvis can be controlled via chat messages and commands. To start a conversation, simply type /new in the chat window.

Available commands:

  • /help or /? - Shows this help message
  • /new - Starts a new conversation
  • /plain - Sends a chat message without code context
  • /copy - Copies the conversation to the clipboard
  • /model <modelName> - Changes the model to use (default: qwen3:1.7b)
  • /model or /model-info - Shows the info card of the current model
  • /model set -<parameter> <value> - Configures model inference parameters (see the sketch after this list)
  • /host <host> - Sets the Ollama host (default: http://localhost:11434)
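
As a rough illustration of what the model and inference settings above correspond to on the Ollama side, the sketch below sends a chat request with explicit options. The parameter names (temperature, top_p) are standard Ollama options; how Jarvis forwards /model set values internally may differ.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Illustration only: a chat request with inference options, mirroring
    // what /model set -temperature 0.2 could translate to on the API side.
    val body = """
        {"model": "qwen3:1.7b",
         "messages": [{"role": "user", "content": "Explain this function briefly."}],
         "options": {"temperature": 0.2, "top_p": 0.9},
         "stream": false}
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/api/chat"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body())
}
```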

When using reasoning models with Ollama, Jarvis shows their internal thoughts in an expandable section at the top of each answer.
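
How the thoughts are separated from the answer depends on the model: many reasoning models available through Ollama (for example qwen3) wrap their reasoning in <think> tags. The sketch below shows one way such output could be split; it illustrates the idea and is not Jarvis's actual parsing code.

```kotlin
// Illustration: split a model reply into thoughts and answer by extracting
// a <think>...</think> block, as emitted by some reasoning models.
fun splitThinking(reply: String): Pair<String, String> {
    val regex = Regex("(?s)<think>(.*?)</think>")
    val match = regex.find(reply) ?: return "" to reply.trim()
    val thoughts = match.groupValues[1].trim()
    val answer = reply.removeRange(match.range).trim()
    return thoughts to answer
}

fun main() {
    val reply = "<think>The user greets me; answer briefly.</think>Hello!"
    val (thoughts, answer) = splitThinking(reply)
    println("Thoughts: $thoughts")  // -> The user greets me; answer briefly.
    println("Answer: $answer")      // -> Hello!
}
```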

License

This project is licensed under the MIT License.
