A Model Context Protocol (MCP) server that provides a "reflect" tool for AI assistants to create cognitive checkpoints and structured reasoning. This tool helps LLMs maintain context, reflect on their work, and think through complex problems systematically.
Key Insight: Without explicitly outputting thought processes, no deep thinking occurs. This tool creates mandatory cognitive checkpoints that prevent shortcuts and improve accuracy.
- **Structured Reasoning**: Forces AI assistants to reflect step-by-step through complex problems
- **Task Validation**: Creates checkpoints to verify requirements are met
- **Learning Documentation**: Captures discoveries and insights during problem-solving
- **Debugging Aid**: Helps work through issues systematically by elimination
- **Decision Audit Trail**: Creates a record of reasoning for important decisions
```bash
# Install globally
npm install -g mcp-reflection-tool

# Or run directly with npx (no installation needed)
npx mcp-reflection-tool
```
Add the server using a single command:
```bash
claude mcp add reflection-tool -- npx mcp-reflection-tool
```
This will automatically configure the server in your Claude Code settings. After running the command, restart Claude Code completely.
Add to your Cursor configuration:
Option 1: Via Settings UI
- Open Cursor Settings (Cmd/Ctrl + ,)
- Search for "MCP" or navigate to Features > MCP
- Add the reflection tool configuration
Option 2: Direct Config Edit
Edit `~/.cursor/mcp_config.json`:
```json
{
  "mcpServers": {
    "reflection-tool": {
      "command": "npx",
      "args": ["mcp-reflection-tool"]
    }
  }
}
```
Restart Cursor after making changes.
Add to your Windsurf MCP configuration:
Location: `~/.windsurf/mcp.json` (macOS/Linux) or `%USERPROFILE%\.windsurf\mcp.json` (Windows)
```json
{
  "mcpServers": {
    "reflection-tool": {
      "command": "npx",
      "args": ["mcp-reflection-tool"]
    }
  }
}
```
Restart Windsurf to apply changes.
Option 1: Via VS Code Settings UI
- Open VS Code Settings (Cmd/Ctrl + ,)
- Search for "Cline MCP"
- Add server configuration
Option 2: Edit settings.json
Add to your VS Code `settings.json`:
```json
{
  "cline.mcpServers": {
    "reflection-tool": {
      "command": "npx",
      "args": ["mcp-reflection-tool"]
    }
  }
}
```
Reload VS Code window after configuration.
The server runs via stdio by default. If you need HTTP mode, use environment variables:
```bash
# Start server in stdio mode (default)
npx mcp-reflection-tool

# Start in HTTP mode on port 8080
HTTP=true npx mcp-reflection-tool

# Start in HTTP mode on custom port
HTTP=true PORT=3000 npx mcp-reflection-tool
```
Most modern AI tools support stdio mode automatically.
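The mode selection above can be sketched as a small helper that reads the same environment variables. This is a hypothetical illustration of the behavior, not the package's actual internals:

```typescript
// Hypothetical sketch of transport selection from the HTTP and PORT
// environment variables described above; not the package's actual code.
type Transport = { mode: "stdio" } | { mode: "http"; port: number };

function chooseTransport(env: Record<string, string | undefined>): Transport {
  // HTTP=true switches from the stdio default to HTTP mode.
  if (env.HTTP === "true") {
    // PORT overrides the default port of 8080.
    return { mode: "http", port: env.PORT ? Number(env.PORT) : 8080 };
  }
  return { mode: "stdio" };
}

console.log(chooseTransport({}));                             // stdio default
console.log(chooseTransport({ HTTP: "true", PORT: "3000" })); // http on 3000
```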
Once installed, the AI assistant will have access to the `reflect` tool. Here are examples of how it gets used:
```
Using reflection tool: "Breaking down authentication implementation:
1. Check existing auth patterns in codebase
2. Set up JWT token generation
3. Add middleware for route protection
4. Test with valid and expired tokens"
```
```
Using reflection tool: "Task completion check:
- Completed: Implemented user authentication with JWT
- Learned: Existing middleware made integration smooth
- Technical debt: Need to add rate limiting
- Next: Update API documentation"
```
```
Using reflection tool: "Debugging slow API responses:
- Symptom: 5+ second response times
- Hypothesis 1: Missing database indexes - CONFIRMED
- Hypothesis 2: N+1 query problem - Also found
- Solution: Added compound indexes and query batching
- Result: Response time now <200ms"
```
The AI assistant will automatically use this tool as a cognitive scratchpad for:
- Chain-of-thought reasoning through complex problems
- Planning an approach before taking actions
- Reflecting on outcomes after completing tasks
- Validating that requirements are met
- Documenting discoveries and learnings
- Creating cognitive checkpoints that can't be skipped
This helps the AI think step-by-step, improving accuracy and compliance.
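Concretely, each checkpoint arrives as a standard MCP `tools/call` request. A hypothetical sketch of the message shape follows; the `thought` argument name and the sample values are illustrative, not confirmed from the package:

```typescript
// Illustrative JSON-RPC request an assistant might send to invoke the
// reflect tool; the "thought" parameter name is an assumption.
const reflectCall = {
  jsonrpc: "2.0",
  id: 42,
  method: "tools/call",
  params: {
    name: "reflect",
    arguments: {
      thought: "Task completion check: requirements met, docs pending.",
    },
  },
};

// In stdio mode the request travels as one line of JSON on stdin.
console.log(JSON.stringify(reflectCall));
```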
- Node.js 18+ or Bun runtime
- npm or bun package manager
```bash
# Clone the repository
git clone https://github.com/sterling/think-tool.git
cd think-tool

# Install dependencies
bun install
# or
npm install

# Run in development mode (with hot-reload)
bun run dev
# or
npm run dev

# Build TypeScript to JavaScript
bun run build
# or
npm run build

# Run the built version
bun start
# or
npm start
```
```
├── src/
│   └── server.ts      # TypeScript source code
├── dist/              # Built JavaScript (generated)
│   ├── server.js      # Main server file
│   └── cli.js         # CLI executable
├── package.json       # NPM package configuration
└── tsconfig.json      # TypeScript configuration
```
- `HTTP`: Set to `true` to enable HTTP mode (default: stdio)
- `PORT`: Server port in HTTP mode (default: 8080), e.g. `PORT=3000 npx mcp-reflection-tool`
The reflection tool implements the Model Context Protocol (MCP) to provide a standardized way for AI assistants to access external tools. When an AI assistant needs to reflect on a problem:
- The AI calls the `reflect` tool with its reasoning
- The tool logs the thought process to the server console
- The tool acknowledges the checkpoint back to the AI
- This creates a cognitive checkpoint that improves reasoning quality
This "thinking out loud" effect has been shown to significantly improve the accuracy and completeness of AI responses.
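The round-trip above can be sketched as a single handler. This is an illustration of the flow, not the package's actual code, and the acknowledgment text is made up:

```typescript
// Sketch of the checkpoint round-trip: log the thought, then acknowledge.
type ToolResult = { content: { type: "text"; text: string }[] };

function handleReflect(thought: string): ToolResult {
  // 1. Log the thought process to the server console (stderr, so it
  //    doesn't interfere with JSON-RPC traffic on stdout in stdio mode).
  console.error(`[reflect] ${thought}`);
  // 2. Acknowledge the checkpoint back to the AI.
  return { content: [{ type: "text", text: "Reflection recorded." }] };
}

console.log(handleReflect("Verifying all requirements are met.").content[0].text);
```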
Port Already in Use
```bash
# Use a different port
PORT=8081 npx mcp-reflection-tool
```
Permission Denied
```bash
# Reinstall globally with proper permissions
sudo npm install -g mcp-reflection-tool
```
Tool Not Available in AI Assistant
- Ensure the MCP server is running
- Restart your AI tool after adding configuration
- Check for valid JSON syntax in config files
- Verify the config file location for your OS
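To rule out the JSON-syntax case, a quick check can be run over a config file's contents. A minimal sketch, with illustrative sample strings:

```typescript
// Returns null for valid JSON, or the parser's error message otherwise.
// A trailing comma is a common cause of a silently ignored MCP config.
function jsonSyntaxError(text: string): string | null {
  try {
    JSON.parse(text);
    return null;
  } catch (err) {
    return (err as Error).message;
  }
}

console.log(jsonSyntaxError('{"mcpServers": {}}'));  // null (valid)
console.log(jsonSyntaxError('{"mcpServers": {},}')); // error message (trailing comma)
```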
- Server logs: Visible in terminal where server is running
- Claude Code logs: `~/Library/Logs/Claude/mcp*.log` (macOS)
- VS Code logs: View > Output > Select "Cline" from dropdown
- Cursor logs: Help > Toggle Developer Tools > Console
```bash
# Check if package is installed globally
npm list -g mcp-reflection-tool

# Test the server directly
npx mcp-reflection-tool

# Test stdio mode
echo '{"jsonrpc":"2.0","method":"initialize","id":1}' | npx mcp-reflection-tool
```
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
MIT License - see the LICENSE file for details.
Created as an MCP implementation for enhancing AI reasoning capabilities.
- Built with FastMCP framework
- Implements the Model Context Protocol standard
- Inspired by research on structured reasoning for AI systems