Project Code Namer


A powerful tool to analyze and improve variable and function names in JavaScript/TypeScript projects using AI-powered suggestions and predefined naming rules.

🚀 Quick Start

  1. Install the package:

    # With npm
    npm install --save-dev project-code-namer
    
    # With yarn
    yarn add --dev project-code-namer
  2. Set up AI (recommended):

    # Install Ollama (free local AI)
    brew install ollama && brew services start ollama
    
    # Run setup script
    yarn setup-ollama  # or npm run setup-ollama
  3. Start analyzing:

    yarn project-code-namer  # or npx project-code-namer

That's it! The tool will guide you through the rest.

🚀 Features

  • 🧠 Smart Analysis: Analyzes code context to suggest more descriptive names
  • 🤖 AI Integration: Support for multiple AI providers (OpenAI, Anthropic, Google Gemini, Ollama, GitHub Copilot)
  • 📋 Predefined Rules: Rule-based system following naming best practices
  • ⚛️ React Support: Specific suggestions for React components, hooks, and events
  • 🧪 Context Detection: Recognizes different file types (testing, API, components, etc.)
  • 📊 Detailed Statistics: Complete analysis reports
  • 🎯 Easy to Use: Interactive and user-friendly CLI interface
  • 🏗️ Clean Architecture: Built with SOLID principles and clean code practices

📦 Installation

Global Installation

With npm:

npm install -g project-code-namer

With yarn:

yarn global add project-code-namer

Project-specific Installation

With npm:

npm install --save-dev project-code-namer

With yarn:

yarn add --dev project-code-namer

🤖 AI Setup (Optional but Recommended)

Option 1: Ollama - Free Local AI (Recommended)

Ollama lets you run AI models locally for free, keeping your code private and avoiding API costs.

Step 1: Install Ollama

macOS:

# Install Ollama using Homebrew
brew install ollama

# Start the Ollama service
brew services start ollama

Windows:

  1. Download from https://ollama.ai/download/windows
  2. Run the installer
  3. Or use winget:
    winget install Ollama.Ollama

Linux:

# Install using the official script
curl -fsSL https://ollama.ai/install.sh | sh

# Start the service
sudo systemctl start ollama
sudo systemctl enable ollama

Step 2: Download a Model

After installing Ollama, download a lightweight model:

# Download a small, fast model (1.3GB) - Recommended
ollama pull llama3.2:1b

# Or a more capable model (4.7GB) - Better quality
ollama pull llama3.2:3b

# Verify downloaded models
ollama list

Step 3: Verify Installation

Test that everything is working:

# Test the model
ollama run llama3.2:1b "Hello, can you suggest better names for a variable called 'data'?"

Step 4: Create Configuration File

In your project root, create a file called .ai-config.json:

{
  "provider": "ollama",
  "ollama": {
    "endpoint": "http://localhost:11434/api/generate",
    "model": "llama3.2:1b"
  }
}

Option 2: OpenAI (Paid)

If you prefer OpenAI's models:

  1. Get your API key from OpenAI Platform
  2. Create a .ai-config.json file in your project root:
    {
      "provider": "openai",
      "openai": {
        "apiKey": "your_actual_api_key_here",
        "model": "gpt-3.5-turbo"
      }
    }
  3. Or use environment variables by creating a .env file:
    OPENAI_API_KEY=your_actual_api_key_here

Auto-Setup for Ollama

Run the automatic setup script to configure Ollama:

With npm:

npm run setup-ollama

With yarn:

yarn setup-ollama

🎯 Usage

Command Line

With npm:

npx project-code-namer

With yarn:

yarn project-code-namer

If installed globally:

project-code-namer

Programmatic Usage

import { NamerSuggesterApp, CodeAnalyzer, SuggestionService } from 'project-code-namer';

// Use the complete application
const app = new NamerSuggesterApp();
await app.run();

// Or use individual components
const result = CodeAnalyzer.analyzeFile('./src/example.ts');

// `config` is an AI configuration object (see the Configuration section);
// `fileContext` is the context extracted by the analyzer.
const suggestionService = new SuggestionService(config);
const suggestions = await suggestionService.getSuggestions(
  'data',
  'variable',
  '',
  fileContext
);

โš™๏ธ Configuration

Quick Start Configuration

After installation, configure an AI provider (or skip AI entirely with the rules provider). The easiest way is:

  1. Run the setup script (automatically creates .ai-config.json):

    # With npm
    npm run setup-ollama
    
    # With yarn
    yarn setup-ollama
  2. Or manually create .ai-config.json in your project root:

AI Configuration Options

Create a .ai-config.json file in your project root with one of these configurations:

Option 1: Ollama (Free Local AI - Recommended)

{
  "provider": "ollama",
  "ollama": {
    "endpoint": "http://localhost:11434/api/generate",
    "model": "llama3.2:1b"
  }
}

Option 2: OpenAI (Paid)

{
  "provider": "openai",
  "openai": {
    "apiKey": "your-api-key-here",
    "model": "gpt-3.5-turbo"
  }
}

Option 3: Mixed Setup (Auto-fallback)

{
  "provider": "auto",
  "ollama": {
    "endpoint": "http://localhost:11434/api/generate",
    "model": "llama3.2:1b"
  },
  "openai": {
    "apiKey": "your-api-key-here",
    "model": "gpt-3.5-turbo"
  }
}

Option 4: Rules Only (No AI)

{
  "provider": "rules"
}

💡 Tip: The setup script (yarn setup-ollama) automatically creates the Ollama configuration for you!

Supported Providers

  • 🆓 ollama: Free local models (Recommended)
  • 🆓 rules: Predefined naming rules only
  • 💰 openai: OpenAI GPT models (Paid)
  • 💰 anthropic: Anthropic Claude (Paid)
  • 💰 gemini: Google Gemini (Paid)
  • 🔄 auto: Tries all available providers
  • 🐙 copilot: GitHub Copilot CLI

๐Ÿ—๏ธ Architecture

The project follows Clean Code and SOLID principles:

src/
├── analyzers/          # Code analysis and context extraction
├── cli/               # Command line interface
├── config/            # Configuration management
├── providers/         # AI providers
├── services/          # Main business logic
├── types/            # TypeScript type definitions
├── utils/            # General utilities
├── bin/              # Executables
└── app.ts            # Main application

Applied Principles

  • SRP (Single Responsibility Principle): Each class has a specific responsibility
  • DRY (Don't Repeat Yourself): Reusable code without duplication
  • Clean Code: Descriptive names, small functions, clear structure
  • Separation of Concerns: Clear separation between analysis, suggestions, CLI, and configuration
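To make the separation concrete, here is a hypothetical sketch of the kind of provider interface such a layered design implies -- the interface and class names are illustrative, not the package's actual API:

```typescript
// Hypothetical provider abstraction illustrating the layering described
// above: each backend (AI or rules) implements one narrow responsibility.
interface NameSuggestionProvider {
  readonly name: string;
  suggest(identifier: string, kind: "variable" | "function"): Promise<string[]>;
}

// A rules-only backend: no AI, just naming conventions like those in this README.
class RulesProvider implements NameSuggestionProvider {
  readonly name = "rules";

  async suggest(identifier: string, kind: "variable" | "function"): Promise<string[]> {
    if (kind === "variable" && identifier === "flag") {
      return ["isEnabled", "isVisible"]; // boolean prefix convention
    }
    if (kind === "function" && identifier === "getData") {
      return ["fetchUserData", "loadUserData"]; // API function convention
    }
    return [];
  }
}
```

With this shape, the CLI and the suggestion service only depend on the interface, so new providers can be added without touching the analysis or presentation layers.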

💡 Types of Suggestions

Functions

  • Event handlers (handle*, on*)
  • API functions (fetch*, load*, retrieve*)
  • Validators (validate*, is*Valid)
  • Initializers (initialize*, setup*, create*)

Variables

  • States and flags (is*, has*, should*)
  • Data and collections (items, collection, payload)
  • Counters and indices (counter, index, position)

React Specific

  • Components (*Component)
  • Custom hooks (use*)
  • Event handlers (handle*, on*)
  • States (*State)
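The conventions above can be expressed as a simple prefix table; this is an illustrative sketch, not the package's internal rule set:

```typescript
// Illustrative prefix rules matching the suggestion categories above.
const functionPrefixRules: Record<string, string[]> = {
  eventHandler: ["handle", "on"],
  apiCall: ["fetch", "load", "retrieve"],
  validator: ["validate", "is"],
  initializer: ["initialize", "setup", "create"],
};

// Check whether a function name follows one of the conventions.
function matchesConvention(name: string, category: keyof typeof functionPrefixRules): boolean {
  return functionPrefixRules[category].some((prefix) => name.startsWith(prefix));
}
```

For example, `matchesConvention("handleClick", "eventHandler")` holds, while a name like `doStuff` matches none of the categories.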

📊 Example

Before

function getData() {
  const d = fetch('/api/users');
  return d;
}

const flag = true;
const arr = [1, 2, 3];

After (suggestions)

function fetchUserData() {
  const userData = fetch('/api/users');
  return userData;
}

const isVisible = true;
const userItems = [1, 2, 3];

🎮 Interactive Demo

When you run project-code-namer, you'll see an interactive menu:

๐Ÿ” Namer Suggester - Name Analyzer
---------------------------------------
๐Ÿงฐ Project detected: TYPESCRIPT
๐Ÿ› ๏ธ Framework: NEXTJS
๐Ÿค– Suggestion engine: Automatic (tries all available)

๐Ÿ”ง What would you like to do?
โฏ ๐Ÿ“‚ Analyze files and get naming suggestions
  โš™๏ธ Configure AI providers
  โ“ View help
  โŒ Exit

📈 Benefits

  • Improved Code Readability: More descriptive and meaningful names
  • Consistency: Follows established naming conventions
  • Team Alignment: Standardized naming across the team
  • Learning Tool: Helps developers learn better naming practices
  • Time Saving: Automated suggestions instead of manual brainstorming
  • Context Awareness: Suggestions based on code context and purpose

🔧 Development

Prerequisites

  • Node.js 16+
  • TypeScript 4.5+

Build

With npm:

npm run build

With yarn:

yarn build

Project Structure

The codebase is organized into specialized modules:

  • analyzers/: AST parsing and context extraction
  • cli/: Interactive user interface
  • config/: Configuration management
  • providers/: AI service integrations
  • services/: Core business logic
  • types/: TypeScript definitions
  • utils/: Shared utilities

๐Ÿค Contributing

  1. Fork the project
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Guidelines

  • Follow TypeScript best practices
  • Maintain test coverage
  • Update documentation
  • Follow existing code style

โ“ FAQ

Which AI provider should I use?

For most users, we recommend Ollama:

  • ✅ Free: No API costs
  • ✅ Private: Your code never leaves your machine
  • ✅ Fast: No network latency
  • ✅ Reliable: No rate limits or quotas

Use OpenAI if:

  • You already have credits
  • You need the absolute best quality suggestions
  • You don't mind the API costs

Does this work offline?

With Ollama: Yes! Once you download a model, everything works completely offline.

With OpenAI: No. Requires internet connection for API calls.

How much disk space does Ollama use?

  • llama3.2:1b: ~1.3GB (recommended for fast suggestions)
  • llama3.2:3b: ~4.7GB (better quality, slower)

Is my code sent to external servers?

With Ollama: No. Everything stays on your local machine.

With OpenAI: Yes. Code snippets are sent to OpenAI's servers for analysis.

Can I use this in CI/CD?

Yes! Use the rules provider for CI/CD environments:

{
  "provider": "rules"
}

This uses only predefined rules without requiring AI models.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • Inspired by code naming best practices
  • Uses Babel for AST analysis
  • Integration with multiple AI providers for advanced suggestions
  • Built with modern TypeScript and Node.js

📞 Support

🔮 Roadmap

  • Unit and integration tests
  • VS Code extension
  • More AI providers
  • Custom rule configuration
  • Batch processing mode
  • CI/CD integration
  • Support for more languages

Made by Juan David Peña

Helping developers write better, more readable code, one name at a time.
