An intelligent command-line interface that leverages AI to help you interact with and work on codebases, similar to Gemini CLI and Claude Code.
Cool-Code is a powerful tool that combines the capabilities of large language models with a comprehensive set of development tools to provide an interactive codebase and project management experience. Simply describe what you want to accomplish, and the AI agent will understand your intent and execute the necessary operations.
A demo of Cool-Code spinning up a Node Express server with Prisma:
Cool.Code.Demo.VId.mp4
- Natural Language Processing: Interact with your codebase using plain English
- Intelligent Code Analysis: Understands your project structure and coding patterns, with no vector DBs needed
- File Operations: Read, create, edit, and search files with AI assistance
- Shell Command Execution: Run system commands through the AI agent
- Real-time Streaming: Get live feedback as the AI processes your requests
- Context-Aware: Maintains conversation history and project context.
- Node.js (v16 or higher)
- A Google AI API key for Gemini
Install globally from npm:

```bash
npm install -g cool-code
```

Set your Google AI API key:

```bash
export GOOGLE_GENERATIVE_AI_API_KEY=your_api_key_here
```
- Clone the repository:

```bash
git clone https://github.com/rushikeshg25/cool-code.git
cd cool-code
```

- Install dependencies:

```bash
npm install
```

- Set up environment variables:

```bash
cp .env.example .env
```

Edit `.env` and add your Google AI API key:

```
GOOGLE_GENERATIVE_AI_API_KEY=your_api_key_here
```

- Build the project:

```bash
npm run build
```

- Link for local development:

```bash
npm link
```
Navigate to your project's directory and run:

```bash
cool-code
```
The main entry point that initializes the CLI using Commander.js and starts the interactive session.
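For orientation, here is a minimal, hypothetical sketch of what a Commander.js entry point like this can look like; the option values and the `startSession` helper are illustrative assumptions, not the project's actual exports:

```typescript
// Hypothetical sketch of a Commander.js entry point (not the actual src/index.ts).
import { Command } from 'commander';

const program = new Command();

program
  .name('cool-code')
  .description('AI-powered CLI for working with your codebase')
  .version('0.1.0')
  .action(async () => {
    // startSession is a placeholder: it would show the landing screen
    // and start the interactive query loop described below.
    const { startSession } = await import('./ui/landing');
    await startSession(process.cwd());
  });

program.parseAsync(process.argv);
```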
- Landing (`landing.ts`): Displays the welcome screen with ASCII art
- Query Handler (`query.ts`): Manages user input and query processing
- Spinner (`spinner.ts`): Provides visual feedback during processing
- Processor (`processor.ts`): Main orchestrator that handles query processing
- LLM (`llm.ts`): Manages communication with Google's Gemini AI model
- Context Manager (`contextManager.ts`): Maintains conversation history and project state
- Prompts (`prompts.ts`): Contains system prompts and examples for the AI
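As a rough illustration of the LLM layer, the sketch below streams a Gemini response using the Vercel AI SDK (`ai` + `@ai-sdk/google`), which reads the `GOOGLE_GENERATIVE_AI_API_KEY` environment variable; this is an assumption about the underlying client, and the function name is a placeholder:

```typescript
// Hedged sketch of streaming a Gemini response; assumes the Vercel AI SDK.
import { google } from '@ai-sdk/google';
import { streamText } from 'ai';

export async function streamQuery(prompt: string): Promise<string> {
  const result = await streamText({
    model: google('gemini-2.5-flash'),
    prompt,
  });

  let full = '';
  // Emit tokens as they arrive so the user gets real-time feedback.
  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
    full += chunk;
  }
  return full;
}
```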
A comprehensive set of tools that the AI can use:
- File Operations:
  - `readFileTool.ts`: Read file contents
  - `editTool.ts`: Edit existing files
  - `newFileTool.ts`: Create new files
- Search & Discovery:
  - `globTool.ts`: Find files using glob patterns
  - `grepTool.ts`: Search file contents using regex
- System Operations:
  - `shellTool.ts`: Execute shell commands
  - `ignoreGitIgnoreFileTool.ts`: Handle .gitignore patterns
- Tool Management:
  - `tool-registery.ts`: Registry of all available tools
  - `toolValidator.ts`: Validates and executes tool calls
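The exact tool interface lives in `src/types`; the following is only a guess at what a tool definition and registry entry might look like (the `ToolDefinition` shape and `toolRegistry` name are assumptions for illustration):

```typescript
// Hypothetical sketch of a tool definition and its registry entry.
import { promises as fs } from 'fs';

interface ToolDefinition {
  name: string;
  description: string;
  execute: (options: Record<string, unknown>) => Promise<string>;
}

const readFileTool: ToolDefinition = {
  name: 'read_file',
  description: 'Read the contents of a file at an absolute path',
  execute: (options) => fs.readFile(String(options.absolutePath), 'utf-8'),
};

// The registry maps the tool names the model emits to their implementations.
export const toolRegistry = new Map<string, ToolDefinition>([
  [readFileTool.name, readFileTool],
]);
```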
User starts CLI → Landing screen → Query input prompt

User Query → Context Manager → LLM Processing → Tool Selection → Tool Execution → Response
- User Input: User enters a natural language query
- Context Building: The Context Manager builds a comprehensive prompt including:
  - System instructions
  - Project file structure
  - Available tools
  - Conversation history
- AI Processing: The LLM (Gemini) processes the prompt and decides which tools to use
- Tool Execution: Selected tools are validated and executed in sequence
- Response Generation: Results are formatted and presented to the user
- Context Update: The conversation history is updated for future queries
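Putting those steps together, one query/response cycle can be sketched roughly as follows; the `Deps` interface and function names are placeholders standing in for `contextManager.ts`, `llm.ts`, and the tool layer, not the project's actual signatures:

```typescript
// Hypothetical sketch of one query/response cycle.
interface ToolCall {
  tool: string;
  description: string;
  toolOptions: Record<string, unknown>;
}

interface Deps {
  buildPrompt(query: string, history: string[]): string; // contextManager.ts
  callModel(prompt: string): Promise<ToolCall[]>;         // llm.ts
  runTool(call: ToolCall): Promise<string>;               // tool validation + execution
}

export async function processQuery(deps: Deps, query: string, history: string[]): Promise<string> {
  // Context building: system instructions + project structure + tools + history.
  const prompt = deps.buildPrompt(query, history);

  // AI processing: the model decides which tools to call.
  const toolCalls = await deps.callModel(prompt);

  // Tool execution: validated calls run one after another, not in parallel.
  const results: string[] = [];
  for (const call of toolCalls) {
    results.push(await deps.runTool(call));
  }

  // Response generation and context update for future queries.
  const response = results.join('\n');
  history.push(`user: ${query}`, `assistant: ${response}`);
  return response;
}
```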
The AI uses a sophisticated tool selection system:
```typescript
// Example tool call structure
[
  {
    tool: 'read_file',
    description: 'Reading current server configuration',
    toolOptions: {
      absolutePath: '/src/server.js',
    },
  },
  {
    tool: 'edit_file',
    description: 'Adding new middleware',
    toolOptions: {
      filePath: '/src/server.js',
      oldString: "const express = require('express');\nconst app = express();",
      newString:
        "const express = require('express');\nconst auth = require('./auth');\nconst app = express();",
    },
  },
];
```
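Before anything in such a list runs, each call is checked against the registry; below is a generic, hedged sketch of that validation step (the tool names beyond `read_file` and `edit_file` are assumptions):

```typescript
// Hypothetical sketch of validating a tool call before execution.
interface PendingToolCall {
  tool: string;
  description: string;
  toolOptions: Record<string, unknown>;
}

const knownTools = new Set(['read_file', 'edit_file', 'new_file', 'glob', 'grep', 'shell']);

export function validateToolCall(call: PendingToolCall): void {
  if (!knownTools.has(call.tool)) {
    throw new Error(`Unknown tool: ${call.tool}`);
  }
  if (typeof call.toolOptions !== 'object' || call.toolOptions === null) {
    throw new Error(`Tool ${call.tool} was called without options`);
  }
}
```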
- `GOOGLE_GENERATIVE_AI_API_KEY`: Your Google AI API key for Gemini
The system uses Google's Gemini 2.5 Flash model by default. You can modify the configuration in `src/core/processor.ts`:
```typescript
this.config = {
  LLMConfig: {
    model: 'gemini-2.5-flash',
  },
  // ... other config
};
```
- Path Validation: All file operations use absolute paths to prevent directory traversal (see the sketch after this list)
- Git Integration: Respects .gitignore patterns to avoid sensitive files
- Tool Validation: All tool calls are validated before execution
- Context Limits: Conversation history is limited to prevent token overflow
- Error Handling: Comprehensive error handling with user-friendly messages
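As an illustration of the path-validation idea, here is a generic sketch (not the project's actual implementation) of rejecting paths that would escape the project root:

```typescript
// Generic sketch: resolve a candidate path and refuse anything outside the project root.
import path from 'path';

export function assertInsideProject(projectRoot: string, candidate: string): string {
  const root = path.resolve(projectRoot);
  const resolved = path.resolve(root, candidate);
  if (resolved !== root && !resolved.startsWith(root + path.sep)) {
    throw new Error(`Path escapes the project directory: ${candidate}`);
  }
  return resolved;
}
```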
```
cool-code/
├── src/
│   ├── core/              # Core engine
│   │   ├── tools/         # Tool implementations
│   │   ├── utils/         # Utility functions
│   │   ├── contextManager.ts
│   │   ├── llm.ts
│   │   ├── processor.ts
│   │   └── prompts.ts
│   ├── types/             # TypeScript type definitions
│   ├── ui/                # User interface components
│   └── index.ts           # Entry point
├── dist/                  # Compiled output
├── package.json
├── tsconfig.json
└── README.md
```