This VS Code extension allows you to chat with DeepSeek-R1 and other models locally, leveraging Ollama for AI-powered responses.
- Local AI Chat: Interact with DeepSeek-R1 directly in VS Code.
- Multiple Models: Choose between DeepSeek-R1, Qwen 2.5, and LLaMA 3.2.
- Real-time Streaming: Responses are streamed token by token as the model generates them (see the sketch after this list).
- Integrated UI: Clean and responsive chat interface.
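
For context, here is a minimal sketch of how streamed responses can be consumed from a local Ollama server. It assumes the official `ollama` npm client; the function name `streamChat` is illustrative, and the extension's actual wiring (webview messaging, model selection) is omitted.

```ts
// Minimal sketch: stream a chat completion from a local Ollama server.
// Assumes the official `ollama` npm package (https://github.com/ollama/ollama-js);
// the extension itself may wire this differently.
import ollama from 'ollama';

async function streamChat(prompt: string): Promise<void> {
  const response = await ollama.chat({
    model: 'deepseek-r1:1.5b',
    messages: [{ role: 'user', content: prompt }],
    stream: true, // yields partial chunks as tokens are generated
  });
  for await (const part of response) {
    process.stdout.write(part.message.content); // render chunk by chunk
  }
}

streamChat('Why is the sky blue?').catch(console.error);
```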
| Model Name | Description |
|---|---|
| `deepseek-r1:1.5b` | DeepSeek R1 (1.5B params) |
| `qwen2.5:1.5b` | Qwen 2.5 (1.5B params) |
| `llama3.2:1b` | LLaMA 3.2 (1B params) |
- Install Ollama to run DeepSeek-R1 and other models locally.
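  For example, the models from the table above can be pulled ahead of time (standard `ollama pull` syntax; model tags as listed):
  ```sh
  ollama pull deepseek-r1:1.5b
  ollama pull qwen2.5:1.5b
  ollama pull llama3.2:1b
  ```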
- Build the extension by running:
  ```sh
  npm install
  npm run compile
  ```
- Package the extension:
  ```sh
  vsce package
  ```
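  If the `vsce` CLI is not available, install it globally first (it ships as the `@vscode/vsce` npm package):
  ```sh
  npm install -g @vscode/vsce
  ```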
- Install the `.vsix` file in VS Code:
  - Open **Extensions** (`Cmd+Shift+X`).
  - Click `...` (top-right) → **Install from VSIX...**
  - Select the generated `.vsix` file.
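  Alternatively, install it from the command line; the file name below is a placeholder for whatever `vsce package` produced:
  ```sh
  code --install-extension <generated-file>.vsix
  ```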
1. Open **Command Palette** (`Cmd+Shift+P` / `Ctrl+Shift+P`).
2. Run `"Start DeepSeek Chat"`.
3. Type your query and receive AI-generated responses.
For local development, the following commands install dependencies, recompile on change, launch an Extension Development Host, and run lint/tests:

```sh
npm install                        # install dependencies
npm run watch                      # recompile on file changes
code --extensionDevelopmentPath=.  # launch VS Code with the extension loaded
npm run lint                       # lint the source
npm test                           # run the test suite
```
This project is licensed under the **MIT License**.