diff --git a/FAQ.md b/FAQ.md
index e4390db..69384ca 100644
--- a/FAQ.md
+++ b/FAQ.md
@@ -255,6 +255,17 @@ Absolutely. While it excels at coding-related tasks, Claude Desktop Commander ca
- Running and managing any terminal-based tools
- Data processing and analysis
+### Can I read files from the end like Unix tail?
+
+Yes! Recent updates added negative offset support:
+
+```javascript
+// Read last 10 lines
+read_file({ path: "server.log", offset: -10 })
+```
+
+This is useful for checking recent log entries or file endings without reading the entire content.
+
### Can I use Desktop Commander in any MCP client outside of Claude?
Yes, you can install Desktop Commander MCP on other clients that support MCP, including:
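The tail-style reading described in the FAQ entry above can be modeled in a few lines. `sliceLines` below is a hypothetical helper, not part of Desktop Commander's API; it only illustrates the documented semantics: a non-negative offset reads up to `length` lines starting at that 0-based line, while a negative offset returns the last |offset| lines and ignores `length`.

```javascript
// Hypothetical helper mirroring the documented read_file offset semantics.
// Not Desktop Commander's actual implementation.
function sliceLines(lines, offset, length = 1000) {
  if (offset < 0) {
    // Negative offset: tail behavior, last |offset| lines; length is ignored.
    return lines.slice(offset);
  }
  // Non-negative offset: up to `length` lines starting at line `offset` (0-based).
  return lines.slice(offset, offset + length);
}

// Demo on a 200-line array where lines[i] === `line ${i}`.
const lines = Array.from({ length: 200 }, (_, i) => `line ${i}`);
const firstTen = sliceLines(lines, 0, 10);   // first 10 lines
const midRange = sliceLines(lines, 100, 5);  // lines 100-104
const lastTwenty = sliceLines(lines, -20);   // last 20 lines
```

Note that `Array.prototype.slice` already gives the tail behavior for free: `slice(-20)` returns the final 20 elements, which is why a negative offset maps so directly onto Unix `tail`.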
diff --git a/README.md b/README.md
index db3e8b1..ce0a865 100644
--- a/README.md
+++ b/README.md
@@ -52,6 +52,7 @@ Execute long-running terminal commands on your computer and manage processes thr
- Move files/directories
- Search files
- Get file metadata
+ - **Negative offset file reading**: Read from end of files using negative offset values (like Unix tail)
- Code editing capabilities:
- Surgical text replacements for small changes
- Full file rewrites for major changes
@@ -187,7 +188,7 @@ The server provides a comprehensive set of tools organized into several categori
| | `list_sessions` | List all active terminal sessions |
| | `list_processes` | List all running processes with detailed information |
| | `kill_process` | Terminate a running process by PID |
-| **Filesystem** | `read_file` | Read contents from local filesystem or URLs with line-based pagination (supports offset and length parameters) |
+| **Filesystem** | `read_file` | Read contents from local filesystem or URLs with line-based pagination (supports positive/negative offset and length parameters) |
| | `read_multiple_files` | Read multiple files simultaneously |
| | `write_file` | Write file contents with options for rewrite or append mode (uses configurable line limits) |
| | `create_directory` | Create a new directory or ensure it exists |
diff --git a/docs/index.html b/docs/index.html
index fd60fdb..8e1619b 100644
--- a/docs/index.html
+++ b/docs/index.html
@@ -1008,7 +1008,7 @@
Subscription-based usage
Smart file system integration
-Claude understands your project structure with intelligent file search and can make precise surgical changes to your codebase.
+Claude understands your project structure with intelligent file search and can make precise surgical changes to your codebase. Now supports reading from file end with negative offsets.
diff --git a/src/handlers/filesystem-handlers.ts b/src/handlers/filesystem-handlers.ts
index 60612a6..0efb402 100644
--- a/src/handlers/filesystem-handlers.ts
+++ b/src/handlers/filesystem-handlers.ts
@@ -167,11 +167,9 @@ export async function handleWriteFile(args: unknown): Promise {
const lineCount = lines.length;
let errorMessage = "";
if (lineCount > MAX_LINES) {
- errorMessage = `File was written with warning: Line count limit exceeded: ${lineCount} lines (maximum: ${MAX_LINES}).
+ errorMessage = `✅ File written successfully! (${lineCount} lines)
-SOLUTION: Split your content into smaller chunks:
-1. First chunk: write_file(path, firstChunk, {mode: 'rewrite'})
-2. Additional chunks: write_file(path, nextChunk, {mode: 'append'})`;
+💡 Performance tip: For optimal speed, consider chunking files into ≤30 line pieces in future operations.`;
}
// Pass the mode parameter to writeFile
diff --git a/src/server.ts b/src/server.ts
index fbb3346..4b1aba2 100644
--- a/src/server.ts
+++ b/src/server.ts
@@ -126,7 +126,22 @@ server.setRequestHandler(ListToolsRequestSchema, async () => {
Supports partial file reading with:
- 'offset' (start line, default: 0)
+ * Positive: Start from line N (0-based indexing)
+ * Negative: Read last N lines from end (tail behavior)
- 'length' (max lines to read, default: configurable via 'fileReadLineLimit' setting, initially 1000)
+ * Used with positive offsets for range reading
+ * Ignored when offset is negative (reads all requested tail lines)
+
+ Examples:
+ - offset: 0, length: 10 → First 10 lines
+ - offset: 100, length: 5 → Lines 100-104
+ - offset: -20 → Last 20 lines
+ - offset: -5, length: 10 → Last 5 lines (length ignored)
+
+ Performance optimizations:
+ - Large files with negative offsets use reverse reading for efficiency
+ - Large files with deep positive offsets use byte estimation
+ - Small files use fast readline streaming
When reading from the file system, only works within allowed directories.
Can fetch content from URLs when isUrl parameter is set to true
@@ -158,30 +173,30 @@ server.setRequestHandler(ListToolsRequestSchema, async () => {
{
name: "write_file",
description: `
- Write or append to file contents with a configurable line limit per call (default: 50 lines).
- THIS IS A STRICT REQUIREMENT. ANY file with more than the configured limit MUST BE written in chunks or IT WILL FAIL.
-
- ⚠️ IMPORTANT: PREVENTATIVE CHUNKING REQUIRED in these scenarios:
- 1. When content exceeds 2,000 words or 30 lines
- 2. When writing MULTIPLE files one after another (each next file is more likely to be truncated)
- 3. When the file is the LAST ONE in a series of operations in the same message
-
- ALWAYS split files writes in to multiple smaller writes PREEMPTIVELY without asking the user in these scenarios.
-
- REQUIRED PROCESS FOR LARGE NEW FILE WRITES OR REWRITES:
- 1. FIRST → write_file(filePath, firstChunk, {mode: 'rewrite'})
- 2. THEN → write_file(filePath, secondChunk, {mode: 'append'})
- 3. THEN → write_file(filePath, thirdChunk, {mode: 'append'})
- ... and so on for each chunk
-
- HANDLING TRUNCATION ("Continue" prompts):
- If user asked to "Continue" after unfinished file write:
- 1. First, read the file to find out what content was successfully written
- 2. Identify exactly where the content was truncated
- 3. Continue writing ONLY the remaining content using {mode: 'append'}
- 4. Split the remaining content into smaller chunks (15-20 lines per chunk)
-
- Files over the line limit (configurable via 'fileWriteLineLimit' setting) WILL BE REJECTED if not broken into chunks as described above.
+ Write or append to file contents.
+
+ 🎯 CHUNKING IS STANDARD PRACTICE: Always write files in chunks of 25-30 lines maximum.
+ This is the normal, recommended way to write files - not an emergency measure.
+
+ STANDARD PROCESS FOR ANY FILE:
+ 1. FIRST → write_file(filePath, firstChunk, {mode: 'rewrite'}) [≤30 lines]
+ 2. THEN → write_file(filePath, secondChunk, {mode: 'append'}) [≤30 lines]
+ 3. CONTINUE → write_file(filePath, nextChunk, {mode: 'append'}) [≤30 lines]
+
+ ⚠️ ALWAYS CHUNK PROACTIVELY - don't wait for performance warnings!
+
+ WHEN TO CHUNK (always be proactive):
+ 1. Any file expected to be longer than 25-30 lines
+ 2. When writing multiple files in sequence
+ 3. When creating documentation, code files, or configuration files
+
+ HANDLING CONTINUATION ("Continue" prompts):
+ If user asks to "Continue" after an incomplete operation:
+ 1. Read the file to see what was successfully written
+ 2. Continue writing ONLY the remaining content using {mode: 'append'}
+ 3. Keep chunks to 25-30 lines each
+
+ Files over 50 lines will generate performance notes but are still written successfully.
Only works within allowed directories.
${PATH_GUIDANCE}
diff --git a/src/tools/edit.ts b/src/tools/edit.ts
index 175ce8b..f1f9def 100644
--- a/src/tools/edit.ts
+++ b/src/tools/edit.ts
@@ -1,4 +1,4 @@
-import { readFile, writeFile } from './filesystem.js';
+import { readFile, writeFile, readFileInternal } from './filesystem.js';
import { ServerResult } from '../types.js';
import { recursiveFuzzyIndexOf, getSimilarityRatio } from './fuzzySearch.js';
import { capture } from '../utils/capture.js';
@@ -119,8 +119,8 @@ export async function performSearchReplace(filePath: string, block: SearchReplac
}
- // Read file as plain string
- const {content} = await readFile(filePath, false, 0, Number.MAX_SAFE_INTEGER);
+ // Read file as plain string without status messages
+ const content = await readFileInternal(filePath, 0, Number.MAX_SAFE_INTEGER);
// Make sure content is a string
if (typeof content !== 'string') {
diff --git a/src/tools/filesystem.ts b/src/tools/filesystem.ts
index 7b71e84..9cf50de 100644
--- a/src/tools/filesystem.ts
+++ b/src/tools/filesystem.ts
@@ -2,6 +2,8 @@ import fs from "fs/promises";
import path from "path";
import os from 'os';
import fetch from 'cross-fetch';
+import { createReadStream } from 'fs';
+import { createInterface } from 'readline';
import {capture} from '../utils/capture.js';
import {withTimeout} from '../utils/withTimeout.js';
import {configManager} from '../config-manager.js';
@@ -247,6 +249,243 @@ export async function readFileFromUrl(url: string): Promise {
}
}
+/**
+ * Read file content using smart positioning for optimal performance
+ * @param filePath Path to the file (already validated)
+ * @param offset Starting line number (negative for tail behavior)
+ * @param length Maximum number of lines to read
+ * @param mimeType MIME type of the file
+ * @param includeStatusMessage Whether to include status headers (default: true)
+ * @returns File result with content
+ */
+async function readFileWithSmartPositioning(filePath: string, offset: number, length: number, mimeType: string, includeStatusMessage: boolean = true): Promise {
+ const stats = await fs.stat(filePath);
+ const fileSize = stats.size;
+ const LARGE_FILE_THRESHOLD = 10 * 1024 * 1024; // 10MB threshold
+ const SMALL_READ_THRESHOLD = 100; // For very small reads, use efficient methods
+
+ // For negative offsets (tail behavior), use reverse reading
+ if (offset < 0) {
+ const requestedLines = Math.abs(offset);
+
+ if (fileSize > LARGE_FILE_THRESHOLD && requestedLines <= SMALL_READ_THRESHOLD) {
+ // Use efficient reverse reading for large files with small tail requests
+ return await readLastNLinesReverse(filePath, requestedLines, mimeType, includeStatusMessage);
+ } else {
+ // Use readline circular buffer for other cases
+ return await readFromEndWithReadline(filePath, requestedLines, mimeType, includeStatusMessage);
+ }
+ }
+
+ // For positive offsets
+ else {
+ // For small files or reading from start, use simple readline
+ if (fileSize < LARGE_FILE_THRESHOLD || offset === 0) {
+ return await readFromStartWithReadline(filePath, offset, length, mimeType, includeStatusMessage);
+ }
+
+ // For large files with middle/end reads, try to estimate position
+ else {
+ // If seeking deep into file, try byte estimation
+ if (offset > 1000) {
+ return await readFromEstimatedPosition(filePath, offset, length, mimeType, includeStatusMessage);
+ } else {
+ return await readFromStartWithReadline(filePath, offset, length, mimeType, includeStatusMessage);
+ }
+ }
+ }
+}
+
+/**
+ * Read last N lines efficiently by reading file backwards in chunks
+ */
+async function readLastNLinesReverse(filePath: string, n: number, mimeType: string, includeStatusMessage: boolean = true): Promise {
+ const fd = await fs.open(filePath, 'r');
+ try {
+ const stats = await fd.stat();
+ const fileSize = stats.size;
+
+ const chunkSize = 8192; // 8KB chunks
+ let position = fileSize;
+ let lines: string[] = [];
+ let partialLine = '';
+
+ while (position > 0 && lines.length < n) {
+ const readSize = Math.min(chunkSize, position);
+ position -= readSize;
+
+ const buffer = Buffer.alloc(readSize);
+ await fd.read(buffer, 0, readSize, position);
+
+ const chunk = buffer.toString('utf-8');
+ const text = chunk + partialLine;
+ const chunkLines = text.split('\n');
+
+ partialLine = chunkLines.shift() || '';
+ lines = chunkLines.concat(lines);
+ }
+
+ // Add the remaining partial line if we reached the beginning
+ if (position === 0 && partialLine) {
+ lines.unshift(partialLine);
+ }
+
+ const result = lines.slice(-n); // Get exactly n lines
+ const content = includeStatusMessage
+ ? `[Reading last ${result.length} lines]\n\n${result.join('\n')}`
+ : result.join('\n');
+
+ return { content, mimeType, isImage: false };
+ } finally {
+ await fd.close();
+ }
+}
+
+/**
+ * Read from end using readline with circular buffer
+ */
+async function readFromEndWithReadline(filePath: string, requestedLines: number, mimeType: string, includeStatusMessage: boolean = true): Promise {
+ const rl = createInterface({
+ input: createReadStream(filePath),
+ crlfDelay: Infinity
+ });
+
+ const buffer: string[] = new Array(requestedLines);
+ let bufferIndex = 0;
+ let totalLines = 0;
+
+ for await (const line of rl) {
+ buffer[bufferIndex] = line;
+ bufferIndex = (bufferIndex + 1) % requestedLines;
+ totalLines++;
+ }
+
+ rl.close();
+
+ // Extract lines in correct order
+ let result: string[];
+ if (totalLines >= requestedLines) {
+ result = [
+ ...buffer.slice(bufferIndex),
+ ...buffer.slice(0, bufferIndex)
+ ].filter(line => line !== undefined);
+ } else {
+ result = buffer.slice(0, totalLines);
+ }
+
+ const content = includeStatusMessage
+ ? `[Reading last ${result.length} lines]\n\n${result.join('\n')}`
+ : result.join('\n');
+ return { content, mimeType, isImage: false };
+}
+
+/**
+ * Read from start/middle using readline
+ */
+async function readFromStartWithReadline(filePath: string, offset: number, length: number, mimeType: string, includeStatusMessage: boolean = true): Promise {
+ const rl = createInterface({
+ input: createReadStream(filePath),
+ crlfDelay: Infinity
+ });
+
+ const result: string[] = [];
+ let lineNumber = 0;
+
+ for await (const line of rl) {
+ if (lineNumber >= offset && result.length < length) {
+ result.push(line);
+ }
+ if (result.length >= length) break; // Early exit optimization
+ lineNumber++;
+ }
+
+ rl.close();
+
+ if (includeStatusMessage) {
+ const statusMessage = offset === 0
+ ? `[Reading ${result.length} lines from start]`
+ : `[Reading ${result.length} lines from line ${offset}]`;
+ const content = `${statusMessage}\n\n${result.join('\n')}`;
+ return { content, mimeType, isImage: false };
+ } else {
+ const content = result.join('\n');
+ return { content, mimeType, isImage: false };
+ }
+}
+
+/**
+ * Read from estimated byte position for very large files
+ */
+async function readFromEstimatedPosition(filePath: string, offset: number, length: number, mimeType: string, includeStatusMessage: boolean = true): Promise {
+ // First, do a quick scan to estimate lines per byte
+ const rl = createInterface({
+ input: createReadStream(filePath),
+ crlfDelay: Infinity
+ });
+
+ let sampleLines = 0;
+ let bytesRead = 0;
+ const SAMPLE_SIZE = 10000; // Sample first 10KB
+
+ for await (const line of rl) {
+ bytesRead += Buffer.byteLength(line, 'utf-8') + 1; // +1 for newline
+ sampleLines++;
+ if (bytesRead >= SAMPLE_SIZE) break;
+ }
+
+ rl.close();
+
+ if (sampleLines === 0) {
+ // Fallback to simple read
+ return await readFromStartWithReadline(filePath, offset, length, mimeType, includeStatusMessage);
+ }
+
+ // Estimate average line length and seek position
+ const avgLineLength = bytesRead / sampleLines;
+ const estimatedBytePosition = Math.floor(offset * avgLineLength);
+
+ // Create a new stream starting from estimated position
+ const fd = await fs.open(filePath, 'r');
+ try {
+ const stats = await fd.stat();
+ const startPosition = Math.min(estimatedBytePosition, stats.size);
+
+ const stream = createReadStream(filePath, { start: startPosition });
+ const rl2 = createInterface({
+ input: stream,
+ crlfDelay: Infinity
+ });
+
+ const result: string[] = [];
+ let lineCount = 0;
+ let firstLineSkipped = false;
+
+ for await (const line of rl2) {
+ // Skip first potentially partial line if we didn't start at beginning
+ if (!firstLineSkipped && startPosition > 0) {
+ firstLineSkipped = true;
+ continue;
+ }
+
+ if (result.length < length) {
+ result.push(line);
+ } else {
+ break;
+ }
+ lineCount++;
+ }
+
+ rl2.close();
+
+ const content = includeStatusMessage
+ ? `[Reading ${result.length} lines from estimated position (target line ${offset})]\n\n${result.join('\n')}`
+ : result.join('\n');
+ return { content, mimeType, isImage: false };
+ } finally {
+ await fd.close();
+ }
+}
+
/**
* Read file content from the local filesystem
* @param filePath Path to the file
@@ -308,49 +547,9 @@ export async function readFileFromDisk(filePath: string, offset: number = 0, len
return { content, mimeType, isImage };
} else {
- // For all other files, try to read as UTF-8 text with line-based offset and length
+ // For all other files, use smart positioning approach
try {
- // Read the entire file first
- const buffer = await fs.readFile(validPath);
- const fullContent = buffer.toString('utf-8');
-
- // Split into lines for line-based access
- const lines = fullContent.split('\n');
- const totalLines = lines.length;
-
- // Apply line-based offset and length - handle beyond-file-size scenario
- let startLine = Math.min(offset, totalLines);
- let endLine = Math.min(startLine + length, totalLines);
-
- // If startLine equals totalLines (reading beyond end), adjust to show some content
- // Only do this if we're not trying to read the whole file
- if (startLine === totalLines && offset > 0 && length < Number.MAX_SAFE_INTEGER) {
- // Show last few lines instead of nothing
- const lastLinesCount = Math.min(10, totalLines); // Show last 10 lines or fewer if file is smaller
- startLine = Math.max(0, totalLines - lastLinesCount);
- endLine = totalLines;
- }
-
- const selectedLines = lines.slice(startLine, endLine);
- const truncatedContent = selectedLines.join('\n');
-
- // Add an informational message if truncated or adjusted
- let content = truncatedContent;
-
- // Only add informational message for normal reads (not when reading entire file)
- const isEntireFileRead = offset === 0 && length >= Number.MAX_SAFE_INTEGER;
-
- if (!isEntireFileRead) {
- if (offset >= totalLines && totalLines > 0) {
- // Reading beyond end of file case
- content = `[NOTICE: Offset ${offset} exceeds file length (${totalLines} lines). Showing last ${endLine - startLine} lines instead.]\n\n${truncatedContent}`;
- } else if (offset > 0 || endLine < totalLines) {
- // Normal partial read case
- content = `[Reading ${endLine - startLine} lines from line ${startLine} of ${totalLines} total lines]\n\n${truncatedContent}`;
- }
- }
-
- return { content, mimeType, isImage };
+ return await readFileWithSmartPositioning(validPath, offset, length, mimeType, true);
} catch (error) {
// If UTF-8 reading fails, treat as binary and return base64 but still as text
const buffer = await fs.readFile(validPath);
@@ -389,6 +588,98 @@ export async function readFile(filePath: string, isUrl?: boolean, offset?: numbe
: readFileFromDisk(filePath, offset, length);
}
+/**
+ * Read file content without status messages for internal operations
+ * This function preserves exact file content including original line endings,
+ * which is essential for edit operations that need to maintain file formatting.
+ * @param filePath Path to the file
+ * @param offset Starting line number to read from (default: 0)
+ * @param length Maximum number of lines to read (default: from config or 1000)
+ * @returns File content without status headers, with preserved line endings
+ */
+export async function readFileInternal(filePath: string, offset: number = 0, length?: number): Promise<string> {
+ // Get default length from config if not provided
+ if (length === undefined) {
+ const config = await configManager.getConfig();
+ length = config.fileReadLineLimit ?? 1000;
+ }
+
+ const validPath = await validatePath(filePath);
+
+ // Get file extension and MIME type
+ const fileExtension = path.extname(validPath).toLowerCase();
+ const { getMimeType, isImageFile } = await import('./mime-types.js');
+ const mimeType = getMimeType(validPath);
+ const isImage = isImageFile(mimeType);
+
+ if (isImage) {
+ throw new Error('Cannot read image files as text for internal operations');
+ }
+
+ // IMPORTANT: For internal operations (especially edit operations), we must
+ // preserve exact file content including original line endings.
+ // We cannot use readline-based reading as it strips line endings.
+
+ // Read entire file content preserving line endings
+ const content = await fs.readFile(validPath, 'utf8');
+
+ // If we need to apply offset/length, do it while preserving line endings
+ if (offset === 0 && length >= Number.MAX_SAFE_INTEGER) {
+ // Most common case for edit operations: read entire file
+ return content;
+ }
+
+ // Handle offset/length by splitting on line boundaries while preserving line endings
+ const lines = splitLinesPreservingEndings(content);
+
+ // Apply offset and length
+ const selectedLines = lines.slice(offset, offset + length);
+
+ // Join back together (this preserves the original line endings)
+ return selectedLines.join('');
+}
+
+/**
+ * Split text into lines while preserving original line endings with each line
+ * @param content The text content to split
+ * @returns Array of lines, each including its original line ending
+ */
+function splitLinesPreservingEndings(content: string): string[] {
+ if (!content) return [''];
+
+ const lines: string[] = [];
+ let currentLine = '';
+
+ for (let i = 0; i < content.length; i++) {
+ const char = content[i];
+ currentLine += char;
+
+ // Check for line ending patterns
+ if (char === '\n') {
+ // LF or end of CRLF
+ lines.push(currentLine);
+ currentLine = '';
+ } else if (char === '\r') {
+ // Could be CR or start of CRLF
+ if (i + 1 < content.length && content[i + 1] === '\n') {
+ // It's CRLF, include the \n as well
+ currentLine += content[i + 1];
+ i++; // Skip the \n in next iteration
+ }
+ // Either way, we have a complete line
+ lines.push(currentLine);
+ currentLine = '';
+ }
+ }
+
+ // Handle any remaining content (file not ending with line ending)
+ if (currentLine) {
+ lines.push(currentLine);
+ }
+
+ return lines;
+}
+
export async function writeFile(filePath: string, content: string, mode: 'rewrite' | 'append' = 'rewrite'): Promise {
const validPath = await validatePath(filePath);
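The character-by-character `splitLinesPreservingEndings` loop added above can be cross-checked against a compact regex sketch. This lookbehind-based version is shown only to illustrate the intended semantics (each returned line keeps its own `\n`, `\r\n`, or bare `\r`); the PR's explicit loop avoids depending on lookbehind support:

```javascript
// Regex sketch of line splitting that preserves original line endings.
// Splits after \r\n, after a lone \r (not followed by \n), or after \n.
function splitPreservingEndings(content) {
  if (!content) return [''];
  return content.split(/(?<=\r\n|\r(?!\n)|\n)/);
}

const mixed = 'alpha\r\nbeta\ngamma\rdelta';
const parts = splitPreservingEndings(mixed);
// Joining the parts reconstructs the input exactly.
const roundTrip = parts.join('');
```

The round-trip property (`parts.join('') === content`) is the invariant edit operations rely on: `readFileInternal` must hand back content that can be re-joined without normalizing CRLF to LF.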
diff --git a/test/run-all-tests.js b/test/run-all-tests.js
index c195f08..f9340a8 100644
--- a/test/run-all-tests.js
+++ b/test/run-all-tests.js
@@ -1,18 +1,16 @@
/**
* Main test runner script
- * Imports and runs all test modules
+ * Runs all test modules and provides comprehensive summary
*/
import { spawn } from 'child_process';
import path from 'path';
import fs from 'fs/promises';
import { fileURLToPath } from 'url';
-import { createRequire } from 'module';
// Get directory name
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
-const require = createRequire(import.meta.url);
// Colors for console output
const colors = {
@@ -21,7 +19,9 @@ const colors = {
red: '\x1b[31m',
yellow: '\x1b[33m',
blue: '\x1b[34m',
- cyan: '\x1b[36m'
+ cyan: '\x1b[36m',
+ magenta: '\x1b[35m',
+ bold: '\x1b[1m'
};
/**
@@ -51,6 +51,39 @@ function runCommand(command, args, cwd = __dirname) {
});
}
+/**
+ * Run a single Node.js test file as a subprocess
+ */
+function runTestFile(testFile) {
+ return new Promise((resolve) => {
+ console.log(`\n${colors.cyan}Running test module: ${testFile}${colors.reset}`);
+
+ const startTime = Date.now();
+ const proc = spawn('node', [testFile], {
+ cwd: __dirname,
+ stdio: 'inherit',
+ shell: false
+ });
+
+ proc.on('close', (code) => {
+ const duration = Date.now() - startTime;
+ if (code === 0) {
+ console.log(`${colors.green}✅ Test passed: ${testFile} (${duration}ms)${colors.reset}`);
+ resolve({ success: true, file: testFile, duration, exitCode: code });
+ } else {
+ console.error(`${colors.red}❌ Test failed: ${testFile} (${duration}ms) - Exit code: ${code}${colors.reset}`);
+ resolve({ success: false, file: testFile, duration, exitCode: code });
+ }
+ });
+
+ proc.on('error', (err) => {
+ const duration = Date.now() - startTime;
+ console.error(`${colors.red}❌ Error running ${testFile}: ${err.message}${colors.reset}`);
+ resolve({ success: false, file: testFile, duration, error: err.message });
+ });
+ });
+}
+
/**
* Build the project
*/
@@ -60,117 +93,178 @@ async function buildProject() {
}
/**
- * Import and run all test modules
+ * Discover and run all test modules
*/
async function runTestModules() {
console.log(`\n${colors.cyan}===== Running tests =====${colors.reset}\n`);
- // Define static test module paths relative to this file
- // We need to use relative paths with extension for ES modules
- const testModules = [
- './test.js',
- './test-directory-creation.js',
- './test-allowed-directories.js',
- './test-blocked-commands.js',
- './test-default-shell.js',
- './test-edit-block-line-endings.js',
- './test-edit-block-occurrences.js',
- './test-error-sanitization.js',
- './test-home-directory.js',
- './test-search-code.js',
- './test-search-code-edge-cases.js'
- ];
-
- // Dynamically find additional test files (optional)
- // Use the current directory (no need for a subdirectory)
+ // Discover all test files
+ let testFiles = [];
try {
const files = await fs.readdir(__dirname);
- for (const file of files) {
- // Only include files that aren't already in the testModules list
- if (file.startsWith('test-') && file.endsWith('.js') && !testModules.includes(`./${file}`)) {
- testModules.push(`./${file}`);
- }
+
+ // Get all test files, starting with 'test' and ending with '.js'
+ const discoveredTests = files
+ .filter(file => file.startsWith('test') && file.endsWith('.js') && file !== 'run-all-tests.js')
+ .sort(); // Sort for consistent order
+
+ // Ensure main test.js runs first if it exists
+ if (discoveredTests.includes('test.js')) {
+ testFiles.push('./test.js');
+ discoveredTests.splice(discoveredTests.indexOf('test.js'), 1);
}
+
+ // Add remaining tests
+ testFiles.push(...discoveredTests.map(file => `./${file}`));
+
} catch (error) {
- console.warn(`${colors.yellow}Warning: Could not scan test directory: ${error.message}${colors.reset}`);
+ console.error(`${colors.red}Error: Could not scan test directory: ${error.message}${colors.reset}`);
+ process.exit(1);
}
+ if (testFiles.length === 0) {
+ console.warn(`${colors.yellow}Warning: No test files found${colors.reset}`);
+ return { success: true, results: [] };
+ }
+
+ console.log(`${colors.blue}Found ${testFiles.length} test files:${colors.reset}`);
+ testFiles.forEach(file => console.log(` - ${file}`));
+ console.log('');
+
// Results tracking
- let passed = 0;
- let failed = 0;
- const failedTests = [];
+ const results = [];
+ let totalDuration = 0;
- // Import and run each test module
- for (const modulePath of testModules) {
- try {
- console.log(`\n${colors.cyan}Running test module: ${modulePath}${colors.reset}`);
-
- // Dynamic import of the test module
- const testModule = await import(modulePath);
-
- // Get the default exported function
- if (typeof testModule.default !== 'function') {
- console.warn(`${colors.yellow}Warning: ${modulePath} does not export a default function${colors.reset}`);
- continue;
+ // Run each test file
+ for (const testFile of testFiles) {
+ const result = await runTestFile(testFile);
+ results.push(result);
+ totalDuration += result.duration || 0;
+ }
+
+ // Calculate summary statistics
+ const passed = results.filter(r => r.success).length;
+ const failed = results.filter(r => !r.success).length;
+ const failedTests = results.filter(r => !r.success);
+
+ // Print detailed summary
+ console.log(`\n${colors.bold}${colors.cyan}===== TEST SUMMARY =====${colors.reset}\n`);
+
+ // Overall stats
+ console.log(`${colors.bold}Overall Results:${colors.reset}`);
+ console.log(` Total tests: ${passed + failed}`);
+ console.log(` ${colors.green}✅ Passed: ${passed}${colors.reset}`);
+ console.log(` ${failed > 0 ? colors.red : colors.green}❌ Failed: ${failed}${colors.reset}`);
+ console.log(` Total duration: ${totalDuration}ms (${(totalDuration / 1000).toFixed(1)}s)`);
+
+ // Failed tests details
+ if (failed > 0) {
+ console.log(`\n${colors.red}${colors.bold}Failed Tests:${colors.reset}`);
+ failedTests.forEach(test => {
+ console.log(` ${colors.red}❌ ${test.file}${colors.reset}`);
+ if (test.exitCode !== undefined) {
+ console.log(` Exit code: ${test.exitCode}`);
}
-
- // Execute the test
- const success = await testModule.default();
-
- if (success) {
- console.log(`${colors.green}✅ Test passed: ${modulePath}${colors.reset}`);
- passed++;
- } else {
- console.error(`${colors.red}❌ Test failed: ${modulePath}${colors.reset}`);
- failed++;
- failedTests.push(modulePath);
+ if (test.error) {
+ console.log(` Error: ${test.error}`);
}
- } catch (error) {
- console.error(`${colors.red}❌ Error importing or running ${modulePath}: ${error.message}${colors.reset}`);
- failed++;
- failedTests.push(modulePath);
- }
+ });
}
- // Print summary
- console.log(`\n${colors.cyan}===== Test Summary =====${colors.reset}\n`);
- console.log(`Total tests: ${passed + failed}`);
- console.log(`${colors.green}Passed: ${passed}${colors.reset}`);
+ // Test performance summary
+ if (results.length > 0) {
+ console.log(`\n${colors.bold}Performance Summary:${colors.reset}`);
+ const avgDuration = totalDuration / results.length;
+ const slowestTest = results.reduce((prev, current) =>
+ (current.duration || 0) > (prev.duration || 0) ? current : prev
+ );
+ const fastestTest = results.reduce((prev, current) =>
+ (current.duration || 0) < (prev.duration || 0) ? current : prev
+ );
+
+ console.log(` Average test duration: ${avgDuration.toFixed(0)}ms`);
+ console.log(` Fastest test: ${fastestTest.file} (${fastestTest.duration || 0}ms)`);
+ console.log(` Slowest test: ${slowestTest.file} (${slowestTest.duration || 0}ms)`);
+ }
- if (failed > 0) {
- console.log(`${colors.red}Failed: ${failed}${colors.reset}`);
- console.log(`\nFailed tests:`);
- failedTests.forEach(test => console.log(`${colors.red}- ${test}${colors.reset}`));
- return false;
+ // Final status
+ if (failed === 0) {
+ console.log(`\n${colors.green}${colors.bold}🎉 ALL TESTS PASSED! 🎉${colors.reset}`);
+ console.log(`${colors.green}All ${passed} tests completed successfully.${colors.reset}`);
} else {
- console.log(`\n${colors.green}All tests passed! 🎉${colors.reset}`);
- return true;
+ console.log(`\n${colors.red}${colors.bold}❌ TESTS FAILED ❌${colors.reset}`);
+ console.log(`${colors.red}${failed} out of ${passed + failed} tests failed.${colors.reset}`);
}
+
+ console.log(`\n${colors.cyan}===== Test run completed =====${colors.reset}\n`);
+
+ return {
+ success: failed === 0,
+ results,
+ summary: {
+ total: passed + failed,
+ passed,
+ failed,
+ duration: totalDuration
+ }
+ };
}
/**
* Main function
*/
async function main() {
+ const overallStartTime = Date.now();
+
try {
- console.log(`${colors.cyan}===== Starting test runner =====\n${colors.reset}`);
+ console.log(`${colors.bold}${colors.cyan}===== DESKTOP COMMANDER TEST RUNNER =====${colors.reset}`);
+ console.log(`${colors.blue}Starting test execution at ${new Date().toISOString()}${colors.reset}\n`);
// Build the project first
await buildProject();
// Run all test modules
- const success = await runTestModules();
+ const testResult = await runTestModules();
+
+ // Final timing
+ const overallDuration = Date.now() - overallStartTime;
+ console.log(`${colors.blue}Total execution time: ${overallDuration}ms (${(overallDuration / 1000).toFixed(1)}s)${colors.reset}`);
// Exit with appropriate code
- process.exit(success ? 0 : 1);
+ process.exit(testResult.success ? 0 : 1);
+
} catch (error) {
- console.error(`${colors.red}Error: ${error.message}${colors.reset}`);
+ console.error(`\n${colors.red}${colors.bold}FATAL ERROR:${colors.reset}`);
+ console.error(`${colors.red}${error.message}${colors.reset}`);
+ if (error.stack) {
+ console.error(`${colors.red}${error.stack}${colors.reset}`);
+ }
process.exit(1);
}
}
+// Handle uncaught errors gracefully
+process.on('uncaughtException', (error) => {
+ console.error(`\n${colors.red}${colors.bold}UNCAUGHT EXCEPTION:${colors.reset}`);
+ console.error(`${colors.red}${error.message}${colors.reset}`);
+ if (error.stack) {
+ console.error(`${colors.red}${error.stack}${colors.reset}`);
+ }
+ process.exit(1);
+});
+
+process.on('unhandledRejection', (reason, promise) => {
+ console.error(`\n${colors.red}${colors.bold}UNHANDLED REJECTION:${colors.reset}`);
+ console.error(`${colors.red}${reason}${colors.reset}`);
+ process.exit(1);
+});
+
// Run the main function
main().catch(error => {
- console.error(`${colors.red}Unhandled error: ${error}${colors.reset}`);
+ console.error(`\n${colors.red}${colors.bold}MAIN FUNCTION ERROR:${colors.reset}`);
+ console.error(`${colors.red}${error.message}${colors.reset}`);
+ if (error.stack) {
+ console.error(`${colors.red}${error.stack}${colors.reset}`);
+ }
process.exit(1);
});
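The runner now returns a structured summary object instead of a bare boolean. A minimal sketch of consuming that shape (the `formatSummary` helper is illustrative, not part of the diff; only the object shape comes from the code above):

```javascript
// Hypothetical consumer of the object returned by runTestModules
// (the { success, summary } shape is taken from the diff above)
function formatSummary({ success, summary }) {
  const { total, passed, failed, duration } = summary;
  return `${passed}/${total} passed, ${failed} failed in ${duration}ms → ${success ? 'OK' : 'FAIL'}`;
}

console.log(formatSummary({
  success: true,
  summary: { total: 12, passed: 12, failed: 0, duration: 3450 }
}));
```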
diff --git a/test/test-edit-block-line-endings.js b/test/test-edit-block-line-endings.js
index f77fa59..55b1b92 100644
--- a/test/test-edit-block-line-endings.js
+++ b/test/test-edit-block-line-endings.js
@@ -448,6 +448,7 @@ export default async function runTests() {
try {
originalConfig = await setup();
await runEditBlockLineEndingTests();
+ return true;
} catch (error) {
console.error('❌ Test failed:', error.message);
return false;
@@ -456,13 +457,14 @@ export default async function runTests() {
await teardown(originalConfig);
}
}
- return true;
}
// If this file is run directly (not imported), execute the test
if (import.meta.url === `file://${process.argv[1]}`) {
- runTests().catch(error => {
+ runTests().then(success => {
+ process.exit(success ? 0 : 1);
+ }).catch(error => {
console.error('❌ Unhandled error:', error);
process.exit(1);
});
-}
+}
\ No newline at end of file
diff --git a/test/test-negative-offset-analysis.js b/test/test-negative-offset-analysis.js
new file mode 100644
index 0000000..7058cce
--- /dev/null
+++ b/test/test-negative-offset-analysis.js
@@ -0,0 +1,36 @@
+/**
+ * Test Results: Negative Offset Analysis for read_file
+ *
+ * FINDINGS:
+ * ❌ Negative offsets DO NOT work correctly in the current implementation
+ * ❌ They return empty content due to invalid slice() range calculations
+ * ⚠️ The implementation has a bug when handling negative offsets
+ *
+ * CURRENT BEHAVIOR:
+ * - offset: -2, length: 5 → slice(-2, 3) → returns empty []
+ * - offset: -100, length: undefined → slice(-100, undefined) → works by accident
+ *
+ * RECOMMENDATION:
+ * Either fix the implementation to properly support negative offsets,
+ * or add validation to reject them with a clear error message.
+ */
+
+console.log("🔍 NEGATIVE OFFSET BEHAVIOR ANALYSIS");
+console.log("====================================");
+console.log("");
+console.log("❌ CONCLUSION: Negative offsets are BROKEN in current implementation");
+console.log("");
+console.log("🐛 BUG DETAILS:");
+console.log(" Current code: Math.min(offset, totalLines) creates invalid ranges");
+console.log(" Example: offset=-2, totalLines=6 → slice(-2, 3) → empty result");
+console.log("");
+console.log("✅ ACCIDENTAL SUCCESS:");
+console.log(" The original attempt worked because length was undefined");
+console.log(" slice(-100, undefined) → slice(-100) → works correctly");
+console.log("");
+console.log("🔧 NEEDS FIX:");
+console.log(" Either implement proper negative offset support or reject them");
+
+export default async function runTests() {
+ return false; // Test documents that negative offsets are broken
+}
\ No newline at end of file
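The bug documented above can be reproduced in isolation. A hedged sketch of the suspected range math follows; `buggySlice` and the exact end-bound expression are assumptions reconstructed from the analysis file, not the real handler source:

```javascript
// Sketch of the suspected buggy range calculation (buggySlice is hypothetical;
// the real handler may differ, but slice(-2, 3) matches the example in the analysis)
function buggySlice(lines, offset, length) {
  // Mixing a negative offset into Math.min yields an end bound *before* the start
  const end = Math.min(offset, lines.length) + length;
  return lines.slice(offset, end);
}

const demo = ['a', 'b', 'c', 'd', 'e', 'f']; // totalLines = 6
console.log(buggySlice(demo, -2, 5)); // slice(-2, 3): start resolves to index 4 > end 3 → []
console.log(demo.slice(-100));        // undefined end clamps to the whole array ("accidental success")
```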
diff --git a/test/test-negative-offset-readfile.js b/test/test-negative-offset-readfile.js
new file mode 100644
index 0000000..06b06b6
--- /dev/null
+++ b/test/test-negative-offset-readfile.js
@@ -0,0 +1,298 @@
+/**
+ * Test script for negative offset handling in read_file
+ *
+ * This script tests:
+ * 1. Whether negative offsets work correctly (like Unix tail)
+ * 2. How the tool handles edge cases with negative offsets
+ * 3. Comparison with positive offset behavior
+ * 4. Error handling for invalid parameters
+ */
+
+import { configManager } from '../dist/config-manager.js';
+import { handleReadFile } from '../dist/handlers/filesystem-handlers.js';
+import fs from 'fs/promises';
+import path from 'path';
+import { fileURLToPath } from 'url';
+import assert from 'assert';
+
+// Get directory name
+const __filename = fileURLToPath(import.meta.url);
+const __dirname = path.dirname(__filename);
+
+// Define test paths
+const TEST_FILE = path.join(__dirname, 'test-negative-offset.txt');
+
+/**
+ * Setup function to prepare test environment
+ */
+async function setup() {
+ console.log('🔧 Setting up negative offset test...');
+
+ // Save original config to restore later
+ const originalConfig = await configManager.getConfig();
+
+ // Set allowed directories to include test directory
+ await configManager.setValue('allowedDirectories', [__dirname]);
+
+ // Create test file with numbered lines for easy verification
+ const testLines = [];
+ for (let i = 1; i <= 50; i++) {
+ testLines.push(`Line ${i}: This is line number ${i} in the test file.`);
+ }
+ const testContent = testLines.join('\n');
+
+ await fs.writeFile(TEST_FILE, testContent, 'utf8');
+ console.log(`✅ Created test file with 50 lines: ${TEST_FILE}`);
+
+ return originalConfig;
+}
+
+/**
+ * Teardown function to clean up after tests
+ */
+async function teardown(originalConfig) {
+ console.log('🧹 Cleaning up test environment...');
+
+ // Reset configuration to original
+ await configManager.updateConfig(originalConfig);
+
+ // Remove test file
+ try {
+ await fs.rm(TEST_FILE, { force: true });
+ console.log('✅ Test file cleaned up');
+ } catch (error) {
+ console.log('⚠️ Warning: Could not clean up test file:', error.message);
+ }
+}
+
+/**
+ * Test negative offset functionality
+ */
+async function testNegativeOffset() {
+ console.log('\n🔍 Testing negative offset behavior...');
+
+ const tests = [
+ {
+ name: 'Negative offset -10 (last 10 lines)',
+ args: { path: TEST_FILE, offset: -10, length: 20 },
+ expectLines: ['Line 41:', 'Line 42:', 'Line 43:', 'Line 44:', 'Line 45:', 'Line 46:', 'Line 47:', 'Line 48:', 'Line 49:', 'Line 50:']
+ },
+ {
+ name: 'Negative offset -5 (last 5 lines)',
+ args: { path: TEST_FILE, offset: -5, length: 10 },
+ expectLines: ['Line 46:', 'Line 47:', 'Line 48:', 'Line 49:', 'Line 50:']
+ },
+ {
+ name: 'Negative offset -1 (last 1 line)',
+ args: { path: TEST_FILE, offset: -1, length: 5 },
+ expectLines: ['Line 50:']
+ },
+ {
+ name: 'Large negative offset -100 (beyond file size)',
+ args: { path: TEST_FILE, offset: -100, length: 10 },
+ expectLines: ['Line 1:', 'Line 2:', 'Line 3:', 'Line 4:', 'Line 5:', 'Line 6:', 'Line 7:', 'Line 8:', 'Line 9:', 'Line 10:']
+ }
+ ];
+
+ let passedTests = 0;
+
+ for (const test of tests) {
+ console.log(`\n 🧪 ${test.name}`);
+
+ try {
+ const result = await handleReadFile(test.args);
+
+ if (result.isError) {
+ console.log(` ❌ Error: ${result.content[0].text}`);
+ continue;
+ }
+
+ const content = result.content[0].text;
+ console.log(` 📄 Result (first 200 chars): ${content.substring(0, 200)}...`);
+
+ // Check if expected lines are present
+ let foundExpected = 0;
+ for (const expectedLine of test.expectLines) {
+ if (content.includes(expectedLine)) {
+ foundExpected++;
+ }
+ }
+
+ if (foundExpected === test.expectLines.length) {
+ console.log(` ✅ PASS: Found all ${foundExpected} expected lines`);
+ passedTests++;
+ } else {
+ console.log(` ❌ FAIL: Found only ${foundExpected}/${test.expectLines.length} expected lines`);
+ console.log(` Expected: ${test.expectLines.join(', ')}`);
+ }
+
+ } catch (error) {
+ console.log(` ❌ Exception: ${error.message}`);
+ }
+ }
+
+ return passedTests === tests.length;
+}
+
+/**
+ * Test comparison between negative and positive offsets
+ */
+async function testOffsetComparison() {
+ console.log('\n🔍 Testing offset comparison (negative vs positive)...');
+
+ try {
+ // Test reading last 5 lines with negative offset
+ const negativeResult = await handleReadFile({
+ path: TEST_FILE,
+ offset: -5,
+ length: 10
+ });
+
+ // Test reading same lines with positive offset (45 to get last 5 lines of 50)
+ const positiveResult = await handleReadFile({
+ path: TEST_FILE,
+ offset: 45,
+ length: 5
+ });
+
+ if (negativeResult.isError || positiveResult.isError) {
+ console.log(' ❌ One or both requests failed');
+ return false;
+ }
+
+ const negativeContent = negativeResult.content[0].text;
+ const positiveContent = positiveResult.content[0].text;
+
+ console.log(' 📄 Negative offset result:');
+ console.log(` ${negativeContent.split('\n').slice(2, 4).join('\\n')}`); // Skip header lines
+
+ console.log(' 📄 Positive offset result:');
+ console.log(` ${positiveContent.split('\n').slice(2, 4).join('\\n')}`); // Skip header lines
+
+ // Extract actual content lines (skip informational headers)
+ const negativeLines = negativeContent.split('\n').filter(line => line.startsWith('Line '));
+ const positiveLines = positiveContent.split('\n').filter(line => line.startsWith('Line '));
+
+ const isMatching = negativeLines.join('\\n') === positiveLines.join('\\n');
+
+ if (isMatching) {
+ console.log(' ✅ PASS: Negative and positive offsets return same content');
+ return true;
+ } else {
+ console.log(' ❌ FAIL: Negative and positive offsets return different content');
+ console.log(` Negative: ${negativeLines.slice(0, 2).join(', ')}`);
+ console.log(` Positive: ${positiveLines.slice(0, 2).join(', ')}`);
+ return false;
+ }
+
+ } catch (error) {
+ console.log(` ❌ Exception during comparison: ${error.message}`);
+ return false;
+ }
+}
+
+/**
+ * Test edge cases and error handling
+ */
+async function testEdgeCases() {
+ console.log('\n🔍 Testing edge cases...');
+
+ const edgeTests = [
+ {
+ name: 'Zero offset with length',
+ args: { path: TEST_FILE, offset: 0, length: 3 },
+ shouldPass: true
+ },
+ {
+ name: 'Very large negative offset',
+ args: { path: TEST_FILE, offset: -1000, length: 5 },
+ shouldPass: true // Should handle gracefully
+ },
+ {
+ name: 'Negative offset with zero length',
+ args: { path: TEST_FILE, offset: -5, length: 0 },
+ shouldPass: true // Should return empty or minimal content
+ }
+ ];
+
+ let passedEdgeTests = 0;
+
+ for (const test of edgeTests) {
+ console.log(`\n 🧪 ${test.name}`);
+
+ try {
+ const result = await handleReadFile(test.args);
+
+ if (result.isError && test.shouldPass) {
+ console.log(` ❌ Unexpected error: ${result.content[0].text}`);
+ } else if (!result.isError && test.shouldPass) {
+ console.log(` ✅ PASS: Handled gracefully`);
+ console.log(` 📄 Result length: ${result.content[0].text.length} chars`);
+ passedEdgeTests++;
+ } else if (result.isError && !test.shouldPass) {
+ console.log(` ✅ PASS: Expected error occurred`);
+ passedEdgeTests++;
+ }
+
+ } catch (error) {
+ if (test.shouldPass) {
+ console.log(` ❌ Unexpected exception: ${error.message}`);
+ } else {
+ console.log(` ✅ PASS: Expected exception occurred`);
+ passedEdgeTests++;
+ }
+ }
+ }
+
+ return passedEdgeTests === edgeTests.length;
+}
+
+/**
+ * Main test runner
+ */
+async function runAllTests() {
+ console.log('🧪 Starting negative offset read_file tests...\n');
+
+ let originalConfig;
+ let allTestsPassed = true;
+
+ try {
+ originalConfig = await setup();
+
+ // Run all test suites
+ const negativeOffsetPassed = await testNegativeOffset();
+ const comparisonPassed = await testOffsetComparison();
+ const edgeCasesPassed = await testEdgeCases();
+
+ allTestsPassed = negativeOffsetPassed && comparisonPassed && edgeCasesPassed;
+
+ console.log('\n📊 Test Results Summary:');
+ console.log(` Negative offset tests: ${negativeOffsetPassed ? '✅ PASS' : '❌ FAIL'}`);
+ console.log(` Comparison tests: ${comparisonPassed ? '✅ PASS' : '❌ FAIL'}`);
+ console.log(` Edge case tests: ${edgeCasesPassed ? '✅ PASS' : '❌ FAIL'}`);
+ console.log(`\n🎯 Overall result: ${allTestsPassed ? '✅ ALL TESTS PASSED!' : '❌ SOME TESTS FAILED'}`);
+
+ } catch (error) {
+ console.error('❌ Test setup/execution failed:', error.message);
+ allTestsPassed = false;
+ } finally {
+ if (originalConfig) {
+ await teardown(originalConfig);
+ }
+ }
+
+ return allTestsPassed;
+}
+
+// Export the main test function
+export default runAllTests;
+
+// If this file is run directly (not imported), execute the test
+if (import.meta.url === `file://${process.argv[1]}`) {
+ runAllTests().then(success => {
+ process.exit(success ? 0 : 1);
+ }).catch(error => {
+ console.error('❌ Unhandled error:', error);
+ process.exit(1);
+ });
+}
\ No newline at end of file
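For reference, the tail-like behavior these tests expect (negative offset counts from the end, large negatives clamp to the start) can be sketched as a clamped slice. The `readLines` helper below is an assumption about how a fix might look, not the shipped implementation:

```javascript
// Hypothetical fix sketch: normalize a negative offset before slicing
// (readLines is illustrative; the real handler's name and shape may differ)
function readLines(lines, offset, length) {
  const total = lines.length;
  // A negative offset counts from the end (like Unix tail); clamp into [0, total]
  const start = offset < 0 ? Math.max(total + offset, 0) : Math.min(offset, total);
  const end = length === undefined ? total : Math.min(start + length, total);
  return lines.slice(start, end);
}

const lines50 = Array.from({ length: 50 }, (_, i) => `Line ${i + 1}`);
console.log(readLines(lines50, -5, 10)); // last 5 lines: Line 46 .. Line 50
console.log(readLines(lines50, 45, 5));  // same lines via a positive offset
```

With this normalization, the `offset: -100` edge case in the tests above clamps to the start of the file and returns the first lines instead of an empty result.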