Skip to content

Fix: Resolve Wiki Structure Timeout Issues for Complex Repositories #273

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Open
wants to merge 8 commits into
base: main
Choose a base branch
from

Conversation

dennisdornon
Copy link

Summary

This PR fixes the critical timeout issue where wiki structure determination was prematurely timing out after 5 minutes when complex repositories required 20+ minutes of processing time. The issue was caused by hardcoded timeout caps in the frontend that overrode the backend's dynamic timeout calculations.

Key Changes Made

🔧 Frontend Timeout Logic Updates (src/app/[owner]/[repo]/page.tsx)

  • Removed hardcoded 5-minute caps on per-page timeouts that were preventing longer processing times
  • Increased timeout thresholds for extra-large repositories from 15 minutes to 30 minutes
  • Replaced hardcoded defaults with configurable environment variables
  • Added safety bounds with minimum 5 minutes and maximum 2 hours (configurable)
  • Implemented timeout adjustment logging for better debugging visibility

🌍 Environment Variable Support

  • NEXT_PUBLIC_MAX_PROCESSING_TIMEOUT - Maximum global processing timeout (default: 2 hours)
  • NEXT_PUBLIC_MAX_PAGE_TIMEOUT - Maximum per-page timeout (default: 15 minutes)
  • NEXT_PUBLIC_DEFAULT_TIMEOUT - Default when complexity analysis fails (default: 10 minutes)
  • NEXT_PUBLIC_TIMEOUT_XLARGE - Extra-large repository threshold (default: 30 minutes)
  • Additional size-based threshold configurations for small/medium/large repositories

📋 Documentation

  • Created comprehensive documentation (TIMEOUT_FIX_DOCUMENTATION.md) explaining the fix
  • Environment variable examples with detailed descriptions of each timeout setting

Problem Solved

Before: Backend calculates 20-minute timeout → Frontend caps at 5 minutes → Premature timeout error
After: Backend calculates 20-minute timeout → Frontend respects recommendation → Successful processing

Key Benefits

Eliminates premature timeouts for complex repositories requiring extended processing
Maintains performance for simple repositories with appropriate shorter timeouts
Provides configurability through environment variables for different deployment needs
Ensures safety with reasonable minimum and maximum bounds
Backward compatible - works without any environment variables using sensible defaults
Better debugging with timeout adjustment logging

Testing Verification

  • ESLint passes with only pre-existing warnings (React hooks dependencies)
  • Next.js build successful - no TypeScript compilation errors
  • Environment variables properly typed and validated with parseInt()
  • Safety bounds validated - minimum 5 minutes, maximum 2 hours by default
  • Backward compatibility confirmed - maintains existing behavior without env vars

Technical Details

Timeout Flow

  1. Backend Analysis: api/data_pipeline.py analyzes repository complexity and recommends timeouts
  2. Frontend Respect: Frontend now uses these recommendations instead of hardcoded caps
  3. Safety Validation: Timeouts are bounded by configurable min/max values
  4. Fallback Handling: Graceful fallback to increased defaults when dynamic calculation fails

Configuration Hierarchy

  1. Dynamic calculation from backend complexity analysis (preferred)
  2. Environment variable configuration for custom deployment needs
  3. Sensible defaults for zero-configuration deployment

Breaking Changes

None - this is a backward compatible fix that maintains existing behavior while enabling longer timeouts when needed.

Related Issues

Resolves timeout error: "Wiki structure determination timed out after 5 minutes. Please try again with a smaller repository or enable file filtering."

- Remove --turbopack flag from dev script to resolve font loading errors
- Update yarn.lock with dependency changes
- Enables successful development server startup on port 3001
- Enhanced preprocessing function with diagram-type-specific cleaning
- Added activation state management to prevent "inactive participant" errors
- Implemented 4-level error recovery system:
  1. Enhanced preprocessing with syntax fixes
  2. Aggressive preprocessing with diagram reconstruction
  3. Emergency activation fix removing problematic commands
  4. Fallback error diagrams for complete failures
- Fixed sequence diagram arrow syntax issues (double arrows)
- Added comprehensive regex patterns for malformed syntax detection
- Improved error handling with graceful degradation
- Added project documentation and development utilities

Changes resolve multiple Mermaid parsing errors while maintaining
diagram functionality and providing better user experience.
- Remove all Google Fonts CDN dependencies (Noto Sans JP, Noto Serif JP, Geist Mono)
- Replace with system font stack for better performance and privacy
- Remove Font Awesome CDN links from slides page (3 instances)
- Update globals.css to use system fonts: -apple-system, BlinkMacSystemFont, etc.
- Add comprehensive timeout safety mechanisms for wiki structure generation
- Improve error messages for private repository access with specific guidance
- Fix loading state management to prevent infinite loading scenarios
- Enhance XML parsing error handling with better user feedback
- Set repository input field to blank by default (UX improvement)

Technical improvements:
- Added 5-minute timeout for wiki structure determination
- Enhanced error recovery in confirmRefresh function
- Better async error handling in loadData function
- Improved private repo error detection and messaging
- Eliminated external network dependencies for fonts

Benefits:
- Faster page loading (no external font requests)
- Better privacy (no third-party CDN calls)
- Improved offline functionality
- Enhanced error handling and user feedback
- More reliable font rendering across platforms
… approach

This commit fixes critical issues in the embedding pipeline that were causing
repository processing failures with the error "No valid document embeddings found".

## Root Cause Analysis
- ToEmbeddings component worked correctly in isolation but failed to transfer
  embeddings to Document objects in the integrated workflow
- Documents were returned with empty vector arrays despite successful API calls
- Issue appeared to be related to uvicorn worker process environment inheritance

## Key Changes

### Backend Fixes (api/)
- **data_pipeline.py**: Implemented direct embedder approach with batch processing
  to bypass broken ToEmbeddings component. Added comprehensive debugging and
  error handling for embedding pipeline failures.
- **main.py**: Temporarily disabled uvicorn reload mode to fix environment
  variable inheritance issues affecting worker processes.
- **openai_client.py**: Fixed client deserialization issues and added lazy
  initialization for sync_client to prevent startup errors.
- **rag.py**: Enhanced error messages for embedding validation failures with
  specific user guidance and troubleshooting information.
- **tools/embedder.py**: Added load_dotenv() to ensure environment variables
  are available in worker processes.
- **websocket_wiki.py**: Improved error handling and logging for WebSocket
  communication during wiki generation.

### Frontend Fixes (src/)
- **[owner]/[repo]/page.tsx**: Fixed GitHub API authorization header format
  from "Bearer" to "token" format for proper private repository access.

## Technical Details
- Implemented direct OpenAI embeddings API calls with batch processing
- Added comprehensive validation for embedding vectors before database storage
- Enhanced logging throughout the pipeline for better debugging
- Fixed environment variable inheritance in uvicorn worker processes
- Improved error handling and user feedback for embedding failures

## Testing
- Successfully generated complete wiki for MainWP Bulk Settings Manager Extension
- Processed 229 documents with proper embeddings
- Created comprehensive multi-page wiki structure
- Verified end-to-end functionality from repository access to wiki generation

Fixes private repository processing issues and ensures reliable embedding
generation for all supported repository types.
- Fix OpenAI o-series model temperature configuration:
  * Set temperature to 1.0 for o1, o4-mini models (was 0.7)
  * Remove unsupported top_p parameters for reasoning models
  * Applies to both direct OpenAI and OpenRouter providers
  * Resolves 400 Bad Request errors from OpenAI API

- Add missing Next.js API route rewrites:
  * /api/models/config → /models/config
  * /api/wiki/projects → /api/processed_projects
  * Fixes 500 HTTP errors in frontend components
  * Ensures proper frontend-backend API communication

This resolves the web interface errors where users couldn't process
repositories due to API configuration mismatches. The OpenAI reasoning
models (o1, o3, o4) only support temperature=1.0 and reject other values.
- Fix Font Awesome CDN removal breaking slide icons
- Remove redundant environment variable assignments
- Clean up unused init_worker function and import
- Fix unused exception variable in openai_client.py
- Secure debug logs to prevent token exposure

All changes improve code quality, security, and functionality
while maintaining existing behavior.
Fix OpenAI API temperature errors and missing Next.js rewrites
- Remove hardcoded 5-minute caps on processing timeouts
- Add configurable environment variables for timeout control
- Increase timeout thresholds for extra-large repositories (30min)
- Implement safety bounds with min 5min, max 2hr limits
- Add timeout adjustment logging for debugging
- Maintain backward compatibility with reasonable defaults

Fixes premature timeouts when backend recommends 20+ minute processing
times for complex repositories. Now respects dynamic timeout calculations
instead of overriding with hardcoded 5-minute limits.
Copy link
Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Summary of Changes

Hello @dennisdornon, I'm Gemini Code Assist1! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request primarily addresses critical timeout issues that prevented wiki generation for complex repositories by implementing dynamic and configurable timeout mechanisms across the frontend and backend. It significantly enhances the overall robustness of the wiki generation process, improves error handling and user feedback, and makes Mermaid diagram rendering more resilient to syntax errors, leading to a more stable and user-friendly application experience.

Highlights

  • Dynamic Timeout Management: I've updated the frontend to remove hardcoded 5-minute timeout caps, allowing it to respect dynamic timeout recommendations from the backend. This includes increasing thresholds for extra-large repositories and introducing configurable environment variables for all timeout settings (global, per-page, and size-based thresholds), ensuring complex repositories can complete processing.
  • Enhanced Wiki Generation Robustness: The wiki structure determination process now includes retry logic for initial generation failures, improved validation and auto-correction of generated pages (e.g., removing empty titles, fixing related page references), and automatic section generation if the AI doesn't provide them in comprehensive view. Partial progress is also saved to local storage to prevent data loss on timeouts.
  • Improved Mermaid Diagram Rendering: Mermaid diagram rendering has been significantly enhanced to handle syntax errors more gracefully. I've implemented multiple levels of preprocessing (normal, aggressive, emergency fixes for activation issues) and a fallback mechanism to display a generic error diagram, preventing the UI from breaking due to malformed diagrams. Warnings are now displayed when a diagram is simplified or a fallback is used.
  • Comprehensive Error Handling & User Feedback: I've added repository access validation in the backend and improved error messages across the application, providing more specific feedback for issues like invalid access tokens, private repositories, embedding failures, and API errors. Progress tracking with estimated time remaining is now displayed during page generation, enhancing the user experience for long-running processes.
  • Backend Stability & Configuration: The backend now includes repository access validation before cloning, improved logging for embedder setup, and a refactored embedding pipeline with manual execution and persistence verification to address potential issues. Environment variables are now explicitly loaded and defaulted, and uvicorn reload is temporarily disabled for stability.
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point in your pull request via creating an issue comment (i.e. comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.

Feature Command Description
Code Review /gemini review Performs a code review for the current pull request in its current state.
Pull Request Summary /gemini summary Provides a summary of the current pull request in its current state.
Comment @gemini-code-assist Responds in comments when explicitly tagged, both in issue comments and review comments.
Help /gemini help Displays a list of available commands.

Customization

To customize Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counter productive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.

You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

Copy link
Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Code Review

This pull request addresses the critical timeout issue for complex repositories. The addition of configurable environment variables, safety bounds, and detailed logging is a significant improvement. I've highlighted a few areas for improvement, mainly concerning backend code clarity and some potentially unintended changes. Once these points are addressed, this will be an excellent merge.

Comment on lines +48 to +49
# TEMPORARY: Disable reload to fix environment variable issues
is_development = False
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

critical

The temporary disabling of hot-reloading for the development server should be reverted to ensure the development workflow is not broken for other contributors.

Suggested change
# TEMPORARY: Disable reload to fix environment variable issues
is_development = False

Comment on lines +540 to 764
logger.info(f"LocalDB has set_transformed_data: {hasattr(db, 'set_transformed_data')}")

# Use the proper LocalDB API to store transformed data
logger.info("Using proper LocalDB transformed_items storage")

# Store the data using the LocalDB's internal transformed_items structure
if not hasattr(db, 'transformed_items'):
db.transformed_items = {}
db.transformed_items["split_and_embed"] = embedded_docs

# Debug: Check what keys are now in the database
logger.info(f"transformed_items keys: {list(db.transformed_items.keys())}")

# Manually save the database to persist the data
db.save_state(filepath=db_path)
logger.info("Database saved to disk")

# Force a fresh database load to verify persistence
db_reloaded = LocalDB(db_path)
logger.info(f"Reloaded DB has transformed_items: {hasattr(db_reloaded, 'transformed_items')}")

# Check if the reloaded database has the transformed data structure
if hasattr(db_reloaded, 'transformed_items'):
logger.info(f"Reloaded DB transformed_items keys: {list(db_reloaded.transformed_items.keys())}")
else:
logger.error("Reloaded DB has no transformed_items attribute")

# Try to get the transformed data using the proper API
try:
reloaded_docs = db_reloaded.fetch_transformed_items(key="split_and_embed")
if reloaded_docs:
logger.info(f"Verification: Successfully persisted {len(reloaded_docs)} documents")
# Replace the original db with the reloaded one to ensure consistency
db = db_reloaded
else:
logger.error("Verification failed: No documents found after database reload")
# Still proceed with the original database since we have the data in memory
logger.info("Continuing with original database (data exists in memory)")
except Exception as get_error:
logger.error(f"Error getting transformed data: {get_error}")
# Still proceed with the original database since we have the data in memory
logger.info("Continuing with original database (data exists in memory)")

except Exception as e:
logger.error(f"Error storing documents in database: {e}")
raise

logger.info("Manually stored embedded documents in database")

# DEBUG: Check embeddings after storing and retrieving
try:
retrieved_docs = db.fetch_transformed_items(key="split_and_embed") if hasattr(db, 'fetch_transformed_items') else db.get_transformed_data(key="split_and_embed")
if retrieved_docs:
for i, doc in enumerate(retrieved_docs[:3]):
vector_present = hasattr(doc, 'vector') and doc.vector is not None
vector_len = len(doc.vector) if vector_present else 0
logger.info(f"After retrieval - Document {i}: vector_present={vector_present}, vector_length={vector_len}")
else:
logger.error("No documents retrieved from database after storage!")
except Exception as e:
logger.error(f"Error retrieving documents for debug: {e}")
logger.info("Continuing with stored documents in memory")

except Exception as e:
logger.error(f"ToEmbeddings failed: {e}")
import traceback
logger.error(f"ToEmbeddings traceback: {traceback.format_exc()}")
raise

logger.info("=== END MANUAL PIPELINE DEBUGGING ===")

# db.transform(key="split_and_embed")

# Check the results immediately after transform
try:
transformed_docs = db.fetch_transformed_items(key="split_and_embed") if hasattr(db, 'fetch_transformed_items') else db.get_transformed_data(key="split_and_embed")
logger.info(f"After transformation: {len(transformed_docs) if transformed_docs else 0} documents")
except Exception as e:
logger.error(f"Error fetching transformed docs: {e}")
transformed_docs = None

if transformed_docs:
# Check first few documents for embeddings
for i, doc in enumerate(transformed_docs[:3]):
has_vector = hasattr(doc, 'vector') and doc.vector and len(doc.vector) > 0
vector_len = len(doc.vector) if hasattr(doc, 'vector') and doc.vector else 0
logger.info(f"Document {i}: has_vector={has_vector}, vector_length={vector_len}")

logger.info("Transformation completed successfully")
except Exception as e:
logger.error(f"Error during transformation: {e}")
import traceback
logger.error(f"Traceback: {traceback.format_exc()}")
raise

# The database has already been saved and reloaded during the direct embedder approach
# Just ensure the directory exists and return the verified database
os.makedirs(os.path.dirname(db_path), exist_ok=True)
db.save_state(filepath=db_path)
return db
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

high

The manual implementation of the document transformation pipeline is complex and contains debugging logs. Refactor the manual pipeline logic into a separate, well-documented helper function or class to improve transform_documents_and_save_to_db's clarity and maintainability.

Comment on lines +56 to +59
if not os.environ.get("OPENAI_API_KEY"):
os.environ["OPENAI_API_KEY"] = ""
if not os.environ.get("GOOGLE_API_KEY"):
os.environ["GOOGLE_API_KEY"] = ""
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

high

Setting missing API keys to an empty string can mask configuration issues. Consider removing these default assignments and letting the application raise an error if a key is required but not provided.

Comment on lines +1 to +236
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15)


Error: [55.593 : ERROR : "Error parsing" Error: Parse error on line 2:
... A[User Request (POST/GET)] --> B[ma
----------------------^
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85)

Error: [55.593 : ERROR : "Error executing queue" Error: Parse error on line 2:
... A[User Request (POST/GET)] --> B[ma
----------------------^
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83)

Error: Parse error on line 2:
... A[User Request (POST/GET)] --> B[ma
----------------------^
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS'
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21)
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16)
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23)
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18)
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18)
at new Promise (<anonymous>)
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15)


Error: [55.593 : ERROR : "Error parsing" Error: Parse error on line 2:
...ey_Maker.bar_render() bar_render() -
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85)

Error: [55.593 : ERROR : "Error executing queue" Error: Parse error on line 2:
...ey_Maker.bar_render() bar_render() -
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83)

Error: Parse error on line 2:
...ey_Maker.bar_render() bar_render() -
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21)
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16)
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23)
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18)
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18)
at new Promise (<anonymous>)
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15)

Error: [55.593 : ERROR : "Error parsing" Error: Parse error on line 2:
...a --> flatten_array() flatten_array(
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85)

Error: [55.593 : ERROR : "Error executing queue" Error: Parse error on line 2:
...a --> flatten_array() flatten_array(
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83)

Error: Parse error on line 2:
...a --> flatten_array() flatten_array(
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21)
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16)
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23)
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18)
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18)
at new Promise (<anonymous>)
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15)
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

medium

This file contains debug logs or error stack traces and should be removed from the pull request.

Comment on lines +1 to +108
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85)

Error 2:
Error: [53.676 : ERROR : "Error executing queue" Error: Parse error on line 2:
... A[User Request \(POST/GET\)] --> B[m
-----------------------^
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83)

Error 3:
Error: Parse error on line 2:
... A[User Request \(POST/GET\)] --> B[m
-----------------------^
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS'
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21)
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16)
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23)
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18)
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18)
at new Promise (<anonymous>)
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15)

Error 4:
Error: [53.676 : ERROR : "Error parsing" Error: Parse error on line 2:
...ey_Maker.bar_render() bar_render() -
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85)

Error 5:
Error: [53.676 : ERROR : "Error executing queue" Error: Parse error on line 2:
...ey_Maker.bar_render() bar_render() -
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83)

Error 6:
Error: Parse error on line 2:
...ey_Maker.bar_render() bar_render() -
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21)
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16)
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23)
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18)
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18)
at new Promise (<anonymous>)
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15)

Error 7:
Error: [53.676 : ERROR : "Error parsing" Error: Parse error on line 2:
...a --> flatten_array() flatten_array(
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85)


Error 8:
Error: [53.676 : ERROR : "Error executing queue" Error: Parse error on line 2:
...a --> flatten_array() flatten_array(
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71)
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54)
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83)

Error 9:
Error: Parse error on line 2:
...a --> flatten_array() flatten_array(
-----------------------^
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE'
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21)
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16)
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23)
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18)
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26)
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18)
at new Promise (<anonymous>)
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128)
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15)
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

medium

This file also seems to be a debug log and should be removed from the repository.

@@ -3,7 +3,7 @@
"version": "0.1.0",
"private": true,
"scripts": {
"dev": "next dev --turbopack --port 3000",
"dev": "next dev --port 3001 --hostname 0.0.0.0",
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

medium

The dev script has been changed to use --hostname 0.0.0.0, which exposes the development server to the local network. Add a comment explaining the rationale and any security considerations.

@dennisdornon
Copy link
Author

APOLOGIES FOR THE ERROR

This PR was created by mistake due to a configuration error in GitHub CLI. This was intended for my fork (vsbpdev/deepwiki-open) but was accidentally created in the upstream repository.

REQUEST: Please close this PR as it was created in error.

ROOT CAUSE: GitHub CLI defaulted to upstream repository instead of fork
CORRECTIVE ACTION: Implemented safety protocols to prevent this from happening again

I sincerely apologize for the confusion and any inconvenience this may have caused. This type of error will not happen again.

Thank you for your understanding.

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

Successfully merging this pull request may close these issues.

1 participant