pinkpixel-dev/context-generator-mcp
πŸš€ context-generator MCP Server


An MCP (Model Context Protocol) server that scrapes documentation websites and generates context files. Built with proven x-crawl patterns for reliable web scraping.

✨ Features

  • πŸ•·οΈ Smart Documentation Crawling - Uses x-crawl with documentation-specific enhancements
  • 🧠 Platform Detection - Automatically detects GitBook, Docusaurus, VuePress, and other platforms
  • πŸ“ Clean Content Extraction - Removes navigation, ads, and preserves formatting
  • πŸ“‹ context Generation - Creates both llms.txt and llms-full.txt formats
  • ⚑ MCP Integration - Works seamlessly with any MCP client
  • πŸ¦™ Local AI Processing - Full Ollama integration for privacy-focused AI features
  • πŸ’Ύ Robust File Operations - Enterprise-grade file writing with comprehensive error handling
  • πŸ”§ Enhanced Tool Descriptions - LLM-optimized tool schemas with detailed usage guidance

πŸ› οΈ MCP Tools

scrape_documentation

Scrape an entire documentation website and extract content.

{
  "url": "https://docs.example.com",
  "options": {
    "maxPages": 50,
    "maxDepth": 3,
    "outputFormat": "both",
    "delayMs": 1000
  }
}
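The `maxPages` and `maxDepth` options bound how far the crawl spreads from the start URL. As a rough sketch of that interaction (the function name, `LinkMap` stand-in, and breadth-first strategy here are illustrative assumptions, not the server's actual implementation):

```typescript
// Hypothetical sketch: how maxPages and maxDepth could bound a crawl.
// `LinkMap` stands in for pages x-crawl would fetch; names are illustrative.
type LinkMap = Record<string, string[]>;

interface CrawlOptions {
  maxPages: number;
  maxDepth: number;
}

function planCrawl(start: string, linkMap: LinkMap, opts: CrawlOptions): string[] {
  const visited = new Set<string>([start]);
  const order: string[] = [];
  let frontier: string[] = [start];
  let depth = 0;

  // Stop when the frontier empties, the depth budget runs out,
  // or the page budget is exhausted -- whichever comes first.
  while (frontier.length > 0 && depth <= opts.maxDepth && order.length < opts.maxPages) {
    const next: string[] = [];
    for (const url of frontier) {
      if (order.length >= opts.maxPages) break;
      order.push(url); // a real crawler would fetch and extract content here
      for (const link of linkMap[url] ?? []) {
        if (!visited.has(link)) {
          visited.add(link);
          next.push(link);
        }
      }
    }
    frontier = next;
    depth += 1;
  }
  return order;
}
```

With `maxDepth: 0` only the start page is visited; `delayMs` would add a politeness pause between the fetches this sketch elides.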

preview_page

Preview content extraction from a single page.

{
  "url": "https://docs.example.com/getting-started"
}

detect_platform

Detect the documentation platform type for optimization.

{
  "url": "https://docs.example.com"
}
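Platform detection typically works by fingerprinting the page HTML. A minimal sketch of that idea, assuming signature-based matching (the patterns and function below are illustrative, not the server's actual heuristics):

```typescript
// Hypothetical heuristic sketch: checks well-known generator fingerprints
// in a page's HTML. The real detection logic in this server may differ.
function detectPlatform(html: string): string {
  const checks: Array<[string, RegExp]> = [
    ["docusaurus", /<meta[^>]+generator[^>]+Docusaurus/i],
    ["gitbook", /gitbook/i],
    ["vuepress", /vuepress/i],
  ];
  for (const [name, pattern] of checks) {
    if (pattern.test(html)) return name;
  }
  return "unknown";
}
```

Knowing the platform lets the crawler apply platform-specific selectors (e.g., the main-content container) instead of generic extraction.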

generate_context

Generate llms.txt-style context output from previously crawled content.

{
  "crawlResults": [...],
  "options": {
    "format": "full",
    "includeSourceUrls": true
  }
}
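Conceptually, generation walks the crawl results and emits one section per page, optionally with source URLs and full page bodies. A minimal sketch under assumed shapes (the `CrawlResult` fields and rendering below are illustrative; the server's actual llms.txt output may differ):

```typescript
// Hypothetical shapes for crawl results and options; field names are assumptions.
interface CrawlResult {
  url: string;
  title: string;
  content: string;
}

interface ContextOptions {
  format: "summary" | "full";
  includeSourceUrls: boolean;
}

// Render one markdown-style section per crawled page.
function generateContext(results: CrawlResult[], opts: ContextOptions): string {
  return results
    .map((page) => {
      const lines = [`# ${page.title}`];
      if (opts.includeSourceUrls) lines.push(`Source: ${page.url}`);
      if (opts.format === "full") lines.push("", page.content); // full body for llms-full.txt-style output
      return lines.join("\n");
    })
    .join("\n\n");
}
```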

πŸ”§ Installation

Installing via Smithery

To install context-generator-mcp for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @pinkpixel-dev/context-generator-mcp --client claude

Manual Installation

git clone https://github.com/pinkpixel-dev/context-generator-mcp.git
cd context-generator-mcp
npm install
npm run build

πŸƒβ€β™‚οΈ Running

npm start

πŸ§ͺ Development

npm run dev

πŸ“– Usage with MCP Clients

Basic Configuration

Add to your MCP client configuration:

{
  "context-generator": {
    "command": "node",
    "args": ["/path/to/context-generator-server/dist/index.js"]
  }
}

πŸ€– AI Integration (Optional)

For enhanced crawling and content processing, you can configure AI providers.

πŸ¦™ Ollama (Recommended for Local Use)

Benefits:

  • πŸ”’ Privacy: All data stays on your machine
  • πŸ’° Cost-effective: No API fees
  • ⚑ Fast: Local processing
  • 🌐 Offline: Works without internet

Quick Setup:

# 1. Install Ollama: https://ollama.com/download
# 2. Pull a model
ollama pull llama3.1

# 3. Configure environment
echo "OLLAMA_MODEL=llama3.1" >> .env

# 4. Test the integration
npm run test:ollama

MCP Configuration:

{
  "context-generator": {
    "command": "node",
    "args": ["/path/to/context-generator-server/dist/index.js"],
    "env": {
      "OLLAMA_MODEL": "llama3.1"
    }
  }
}

πŸ”‘ OpenAI (Cloud-based)

Benefits:

  • πŸš€ Powerful: Latest GPT models
  • ☁️ No setup: Cloud-based processing
  • πŸ”§ Maintenance-free: Always updated

Setup:

{
  "context-generator": {
    "command": "node",
    "args": ["/path/to/context-generator-server/dist/index.js"],
    "env": {
      "OPENAI_API_KEY": "sk-your-openai-key",
      "OPENAI_MODEL": "gpt-4"
    }
  }
}

πŸ”— Mixed Configuration

{
  "context-generator": {
    "command": "node",
    "args": ["/path/to/context-generator-server/dist/index.js"],
    "env": {
      "OPENAI_API_KEY": "sk-your-openai-key",
      "OPENAI_MODEL": "gpt-4",
      "OLLAMA_MODEL": "llama3.1",
      "OLLAMA_BASE_URL": "http://localhost:11434"
    }
  }
}

Environment Variables Reference

| Variable | Description | Default | Required |
|---|---|---|---|
| **OpenAI** | | | |
| OPENAI_API_KEY | OpenAI API key for AI-assisted crawling | - | βœ… (for OpenAI) |
| OPENAI_MODEL | OpenAI model to use | gpt-3.5-turbo | ❌ |
| **Ollama** | | | |
| OLLAMA_MODEL | Ollama model name (e.g., llama3.1, codellama) | llama3.1 | βœ… (for Ollama) |
| OLLAMA_BASE_URL | Custom Ollama server URL | http://localhost:11434 | ❌ |
| OLLAMA_API_KEY | API key for hosted Ollama instances | - | ❌ |
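When both providers are configured, the server has to pick one. A sketch of one plausible precedence rule based on the variables above (the function and its OpenAI-first ordering are assumptions; the server's real selection logic may differ):

```typescript
// Hypothetical provider selection from environment variables.
type Env = Record<string, string | undefined>;

function selectProvider(env: Env): "openai" | "ollama" | "none" {
  if (env.OPENAI_API_KEY) return "openai"; // assumed precedence, not documented behavior
  if (env.OLLAMA_MODEL) return "ollama";
  return "none"; // AI features disabled; core scraping still works
}
```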

πŸ“– Detailed Setup: See OLLAMA_SETUP.md for complete installation and configuration instructions.

πŸ§ͺ Testing: Use npm run test:ollama to validate your Ollama setup.

⚠️ Note: AI integration is optional. The server works without these variables, but AI-enhanced features won't be available.
