-
Notifications
You must be signed in to change notification settings - Fork 837
Fix: Resolve Wiki Structure Timeout Issues for Complex Repositories #273
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
base: main
Are you sure you want to change the base?
Fix: Resolve Wiki Structure Timeout Issues for Complex Repositories #273
Conversation
- Remove --turbopack flag from dev script to resolve font loading errors - Update yarn.lock with dependency changes - Enables successful development server startup on port 3001
- Enhanced preprocessing function with diagram-type-specific cleaning - Added activation state management to prevent "inactive participant" errors - Implemented 4-level error recovery system: 1. Enhanced preprocessing with syntax fixes 2. Aggressive preprocessing with diagram reconstruction 3. Emergency activation fix removing problematic commands 4. Fallback error diagrams for complete failures - Fixed sequence diagram arrow syntax issues (double arrows) - Added comprehensive regex patterns for malformed syntax detection - Improved error handling with graceful degradation - Added project documentation and development utilities Changes resolve multiple Mermaid parsing errors while maintaining diagram functionality and providing better user experience.
- Remove all Google Fonts CDN dependencies (Noto Sans JP, Noto Serif JP, Geist Mono) - Replace with system font stack for better performance and privacy - Remove Font Awesome CDN links from slides page (3 instances) - Update globals.css to use system fonts: -apple-system, BlinkMacSystemFont, etc. - Add comprehensive timeout safety mechanisms for wiki structure generation - Improve error messages for private repository access with specific guidance - Fix loading state management to prevent infinite loading scenarios - Enhance XML parsing error handling with better user feedback - Set repository input field to blank by default (UX improvement) Technical improvements: - Added 5-minute timeout for wiki structure determination - Enhanced error recovery in confirmRefresh function - Better async error handling in loadData function - Improved private repo error detection and messaging - Eliminated external network dependencies for fonts Benefits: - Faster page loading (no external font requests) - Better privacy (no third-party CDN calls) - Improved offline functionality - Enhanced error handling and user feedback - More reliable font rendering across platforms
… approach This commit fixes critical issues in the embedding pipeline that were causing repository processing failures with the error "No valid document embeddings found". ## Root Cause Analysis - ToEmbeddings component worked correctly in isolation but failed to transfer embeddings to Document objects in the integrated workflow - Documents were returned with empty vector arrays despite successful API calls - Issue appeared to be related to uvicorn worker process environment inheritance ## Key Changes ### Backend Fixes (api/) - **data_pipeline.py**: Implemented direct embedder approach with batch processing to bypass broken ToEmbeddings component. Added comprehensive debugging and error handling for embedding pipeline failures. - **main.py**: Temporarily disabled uvicorn reload mode to fix environment variable inheritance issues affecting worker processes. - **openai_client.py**: Fixed client deserialization issues and added lazy initialization for sync_client to prevent startup errors. - **rag.py**: Enhanced error messages for embedding validation failures with specific user guidance and troubleshooting information. - **tools/embedder.py**: Added load_dotenv() to ensure environment variables are available in worker processes. - **websocket_wiki.py**: Improved error handling and logging for WebSocket communication during wiki generation. ### Frontend Fixes (src/) - **[owner]/[repo]/page.tsx**: Fixed GitHub API authorization header format from "Bearer" to "token" format for proper private repository access. ## Technical Details - Implemented direct OpenAI embeddings API calls with batch processing - Added comprehensive validation for embedding vectors before database storage - Enhanced logging throughout the pipeline for better debugging - Fixed environment variable inheritance in uvicorn worker processes - Improved error handling and user feedback for embedding failures ## Testing - Successfully generated complete wiki for MainWP Bulk Settings Manager Extension - Processed 229 documents with proper embeddings - Created comprehensive multi-page wiki structure - Verified end-to-end functionality from repository access to wiki generation Fixes private repository processing issues and ensures reliable embedding generation for all supported repository types.
- Fix OpenAI o-series model temperature configuration: * Set temperature to 1.0 for o1, o4-mini models (was 0.7) * Remove unsupported top_p parameters for reasoning models * Applies to both direct OpenAI and OpenRouter providers * Resolves 400 Bad Request errors from OpenAI API - Add missing Next.js API route rewrites: * /api/models/config → /models/config * /api/wiki/projects → /api/processed_projects * Fixes 500 HTTP errors in frontend components * Ensures proper frontend-backend API communication This resolves the web interface errors where users couldn't process repositories due to API configuration mismatches. The OpenAI reasoning models (o1, o3, o4) only support temperature=1.0 and reject other values.
- Fix Font Awesome CDN removal breaking slide icons - Remove redundant environment variable assignments - Clean up unused init_worker function and import - Fix unused exception variable in openai_client.py - Secure debug logs to prevent token exposure All changes improve code quality, security, and functionality while maintaining existing behavior.
Fix OpenAI API temperature errors and missing Next.js rewrites
- Remove hardcoded 5-minute caps on processing timeouts - Add configurable environment variables for timeout control - Increase timeout thresholds for extra-large repositories (30min) - Implement safety bounds with min 5min, max 2hr limits - Add timeout adjustment logging for debugging - Maintain backward compatibility with reasonable defaults Fixes premature timeouts when backend recommends 20+ minute processing times for complex repositories. Now respects dynamic timeout calculations instead of overriding with hardcoded 5-minute limits.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Summary of Changes
Hello @dennisdornon, I'm Gemini Code Assist1! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
This pull request primarily addresses critical timeout issues that prevented wiki generation for complex repositories by implementing dynamic and configurable timeout mechanisms across the frontend and backend. It significantly enhances the overall robustness of the wiki generation process, improves error handling and user feedback, and makes Mermaid diagram rendering more resilient to syntax errors, leading to a more stable and user-friendly application experience.
Highlights
- Dynamic Timeout Management: I've updated the frontend to remove hardcoded 5-minute timeout caps, allowing it to respect dynamic timeout recommendations from the backend. This includes increasing thresholds for extra-large repositories and introducing configurable environment variables for all timeout settings (global, per-page, and size-based thresholds), ensuring complex repositories can complete processing.
- Enhanced Wiki Generation Robustness: The wiki structure determination process now includes retry logic for initial generation failures, improved validation and auto-correction of generated pages (e.g., removing empty titles, fixing related page references), and automatic section generation if the AI doesn't provide them in comprehensive view. Partial progress is also saved to local storage to prevent data loss on timeouts.
- Improved Mermaid Diagram Rendering: Mermaid diagram rendering has been significantly enhanced to handle syntax errors more gracefully. I've implemented multiple levels of preprocessing (normal, aggressive, emergency fixes for activation issues) and a fallback mechanism to display a generic error diagram, preventing the UI from breaking due to malformed diagrams. Warnings are now displayed when a diagram is simplified or a fallback is used.
- Comprehensive Error Handling & User Feedback: I've added repository access validation in the backend and improved error messages across the application, providing more specific feedback for issues like invalid access tokens, private repositories, embedding failures, and API errors. Progress tracking with estimated time remaining is now displayed during page generation, enhancing the user experience for long-running processes.
- Backend Stability & Configuration: The backend now includes repository access validation before cloning, improved logging for embedder setup, and a refactored embedding pipeline with manual execution and persistence verification to address potential issues. Environment variables are now explicitly loaded and defaulted, and
uvicorn
reload is temporarily disabled for stability.
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request via creating an issue comment (i.e. comment on the pull request page) using either /gemini <command>
or @gemini-code-assist <command>
. Below is a summary of the supported commands.
Feature | Command | Description |
---|---|---|
Code Review | /gemini review |
Performs a code review for the current pull request in its current state. |
Pull Request Summary | /gemini summary |
Provides a summary of the current pull request in its current state. |
Comment | @gemini-code-assist | Responds in comments when explicitly tagged, both in issue comments and review comments. |
Help | /gemini help |
Displays a list of available commands. |
Customization
To customize Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/
folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counter productive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.
You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
Footnotes
-
Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution. ↩
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Code Review
This pull request addresses the critical timeout issue for complex repositories. The addition of configurable environment variables, safety bounds, and detailed logging is a significant improvement. I've highlighted a few areas for improvement, mainly concerning backend code clarity and some potentially unintended changes. Once these points are addressed, this will be an excellent merge.
# TEMPORARY: Disable reload to fix environment variable issues | ||
is_development = False |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
logger.info(f"LocalDB has set_transformed_data: {hasattr(db, 'set_transformed_data')}") | ||
|
||
# Use the proper LocalDB API to store transformed data | ||
logger.info("Using proper LocalDB transformed_items storage") | ||
|
||
# Store the data using the LocalDB's internal transformed_items structure | ||
if not hasattr(db, 'transformed_items'): | ||
db.transformed_items = {} | ||
db.transformed_items["split_and_embed"] = embedded_docs | ||
|
||
# Debug: Check what keys are now in the database | ||
logger.info(f"transformed_items keys: {list(db.transformed_items.keys())}") | ||
|
||
# Manually save the database to persist the data | ||
db.save_state(filepath=db_path) | ||
logger.info("Database saved to disk") | ||
|
||
# Force a fresh database load to verify persistence | ||
db_reloaded = LocalDB(db_path) | ||
logger.info(f"Reloaded DB has transformed_items: {hasattr(db_reloaded, 'transformed_items')}") | ||
|
||
# Check if the reloaded database has the transformed data structure | ||
if hasattr(db_reloaded, 'transformed_items'): | ||
logger.info(f"Reloaded DB transformed_items keys: {list(db_reloaded.transformed_items.keys())}") | ||
else: | ||
logger.error("Reloaded DB has no transformed_items attribute") | ||
|
||
# Try to get the transformed data using the proper API | ||
try: | ||
reloaded_docs = db_reloaded.fetch_transformed_items(key="split_and_embed") | ||
if reloaded_docs: | ||
logger.info(f"Verification: Successfully persisted {len(reloaded_docs)} documents") | ||
# Replace the original db with the reloaded one to ensure consistency | ||
db = db_reloaded | ||
else: | ||
logger.error("Verification failed: No documents found after database reload") | ||
# Still proceed with the original database since we have the data in memory | ||
logger.info("Continuing with original database (data exists in memory)") | ||
except Exception as get_error: | ||
logger.error(f"Error getting transformed data: {get_error}") | ||
# Still proceed with the original database since we have the data in memory | ||
logger.info("Continuing with original database (data exists in memory)") | ||
|
||
except Exception as e: | ||
logger.error(f"Error storing documents in database: {e}") | ||
raise | ||
|
||
logger.info("Manually stored embedded documents in database") | ||
|
||
# DEBUG: Check embeddings after storing and retrieving | ||
try: | ||
retrieved_docs = db.fetch_transformed_items(key="split_and_embed") if hasattr(db, 'fetch_transformed_items') else db.get_transformed_data(key="split_and_embed") | ||
if retrieved_docs: | ||
for i, doc in enumerate(retrieved_docs[:3]): | ||
vector_present = hasattr(doc, 'vector') and doc.vector is not None | ||
vector_len = len(doc.vector) if vector_present else 0 | ||
logger.info(f"After retrieval - Document {i}: vector_present={vector_present}, vector_length={vector_len}") | ||
else: | ||
logger.error("No documents retrieved from database after storage!") | ||
except Exception as e: | ||
logger.error(f"Error retrieving documents for debug: {e}") | ||
logger.info("Continuing with stored documents in memory") | ||
|
||
except Exception as e: | ||
logger.error(f"ToEmbeddings failed: {e}") | ||
import traceback | ||
logger.error(f"ToEmbeddings traceback: {traceback.format_exc()}") | ||
raise | ||
|
||
logger.info("=== END MANUAL PIPELINE DEBUGGING ===") | ||
|
||
# db.transform(key="split_and_embed") | ||
|
||
# Check the results immediately after transform | ||
try: | ||
transformed_docs = db.fetch_transformed_items(key="split_and_embed") if hasattr(db, 'fetch_transformed_items') else db.get_transformed_data(key="split_and_embed") | ||
logger.info(f"After transformation: {len(transformed_docs) if transformed_docs else 0} documents") | ||
except Exception as e: | ||
logger.error(f"Error fetching transformed docs: {e}") | ||
transformed_docs = None | ||
|
||
if transformed_docs: | ||
# Check first few documents for embeddings | ||
for i, doc in enumerate(transformed_docs[:3]): | ||
has_vector = hasattr(doc, 'vector') and doc.vector and len(doc.vector) > 0 | ||
vector_len = len(doc.vector) if hasattr(doc, 'vector') and doc.vector else 0 | ||
logger.info(f"Document {i}: has_vector={has_vector}, vector_length={vector_len}") | ||
|
||
logger.info("Transformation completed successfully") | ||
except Exception as e: | ||
logger.error(f"Error during transformation: {e}") | ||
import traceback | ||
logger.error(f"Traceback: {traceback.format_exc()}") | ||
raise | ||
|
||
# The database has already been saved and reloaded during the direct embedder approach | ||
# Just ensure the directory exists and return the verified database | ||
os.makedirs(os.path.dirname(db_path), exist_ok=True) | ||
db.save_state(filepath=db_path) | ||
return db |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
if not os.environ.get("OPENAI_API_KEY"): | ||
os.environ["OPENAI_API_KEY"] = "" | ||
if not os.environ.get("GOOGLE_API_KEY"): | ||
os.environ["GOOGLE_API_KEY"] = "" |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15) | ||
|
||
|
||
Error: [55.593 : ERROR : "Error parsing" Error: Parse error on line 2: | ||
... A[User Request (POST/GET)] --> B[ma | ||
----------------------^ | ||
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85) | ||
|
||
Error: [55.593 : ERROR : "Error executing queue" Error: Parse error on line 2: | ||
... A[User Request (POST/GET)] --> B[ma | ||
----------------------^ | ||
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83) | ||
|
||
Error: Parse error on line 2: | ||
... A[User Request (POST/GET)] --> B[ma | ||
----------------------^ | ||
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS' | ||
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21) | ||
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16) | ||
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23) | ||
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18) | ||
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18) | ||
at new Promise (<anonymous>) | ||
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15) | ||
|
||
|
||
Error: [55.593 : ERROR : "Error parsing" Error: Parse error on line 2: | ||
...ey_Maker.bar_render() bar_render() - | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85) | ||
|
||
Error: [55.593 : ERROR : "Error executing queue" Error: Parse error on line 2: | ||
...ey_Maker.bar_render() bar_render() - | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83) | ||
|
||
Error: Parse error on line 2: | ||
...ey_Maker.bar_render() bar_render() - | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21) | ||
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16) | ||
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23) | ||
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18) | ||
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18) | ||
at new Promise (<anonymous>) | ||
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15) | ||
|
||
Error: [55.593 : ERROR : "Error parsing" Error: Parse error on line 2: | ||
...a --> flatten_array() flatten_array( | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85) | ||
|
||
Error: [55.593 : ERROR : "Error executing queue" Error: Parse error on line 2: | ||
...a --> flatten_array() flatten_array( | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83) | ||
|
||
Error: Parse error on line 2: | ||
...a --> flatten_array() flatten_array( | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21) | ||
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16) | ||
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23) | ||
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18) | ||
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18) | ||
at new Promise (<anonymous>) | ||
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15) |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85) | ||
|
||
Error 2: | ||
Error: [53.676 : ERROR : "Error executing queue" Error: Parse error on line 2: | ||
... A[User Request \(POST/GET\)] --> B[m | ||
-----------------------^ | ||
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83) | ||
|
||
Error 3: | ||
Error: Parse error on line 2: | ||
... A[User Request \(POST/GET\)] --> B[m | ||
-----------------------^ | ||
Expecting 'SQE', 'DOUBLECIRCLEEND', 'PE', '-)', 'STADIUMEND', 'SUBROUTINEEND', 'PIPE', 'CYLINDEREND', 'DIAMOND_STOP', 'TAGEND', 'TRAPEND', 'INVTRAPEND', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PS' | ||
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21) | ||
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16) | ||
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23) | ||
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18) | ||
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18) | ||
at new Promise (<anonymous>) | ||
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15) | ||
|
||
Error 4: | ||
Error: [53.676 : ERROR : "Error parsing" Error: Parse error on line 2: | ||
...ey_Maker.bar_render() bar_render() - | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85) | ||
|
||
Error 5: | ||
Error: [53.676 : ERROR : "Error executing queue" Error: Parse error on line 2: | ||
...ey_Maker.bar_render() bar_render() - | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83) | ||
|
||
Error 6: | ||
Error: Parse error on line 2: | ||
...ey_Maker.bar_render() bar_render() - | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21) | ||
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16) | ||
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23) | ||
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18) | ||
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18) | ||
at new Promise (<anonymous>) | ||
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15) | ||
|
||
Error 7: | ||
Error: [53.676 : ERROR : "Error parsing" Error: Parse error on line 2: | ||
...a --> flatten_array() flatten_array( | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1277:85) | ||
|
||
|
||
Error 8: | ||
Error: [53.676 : ERROR : "Error executing queue" Error: Parse error on line 2: | ||
...a --> flatten_array() flatten_array( | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at createConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/console-error.js:27:71) | ||
at handleConsoleError (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/errors/use-error-handler.js:47:54) | ||
at console.error (webpack-internal:///(app-pages-browser)/./node_modules/next/dist/client/components/globals/intercept-console-error.js:47:57) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1242:83) | ||
|
||
Error 9: | ||
Error: Parse error on line 2: | ||
...a --> flatten_array() flatten_array( | ||
-----------------------^ | ||
Expecting 'PS', 'TAGEND', 'STR', 'MD_STR', 'UNICODE_TEXT', 'TEXT', 'TAGSTART', got 'PE' | ||
at Parser.parseError (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1423:21) | ||
at Parser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:1495:16) | ||
at newParser.parse (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/chunks/mermaid.core/flowDiagram-4HSFHLVR.mjs:2297:23) | ||
at Diagram.fromText (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:701:18) | ||
at Object.render (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:988:26) | ||
at eval (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1271:18) | ||
at new Promise (<anonymous>) | ||
at performCall (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1270:128) | ||
at executeQueue (webpack-internal:///(app-pages-browser)/./node_modules/mermaid/dist/mermaid.core.mjs:1240:15) |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
@@ -3,7 +3,7 @@ | |||
"version": "0.1.0", | |||
"private": true, | |||
"scripts": { | |||
"dev": "next dev --turbopack --port 3000", | |||
"dev": "next dev --port 3001 --hostname 0.0.0.0", |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
APOLOGIES FOR THE ERROR This PR was created by mistake due to a configuration error in GitHub CLI. This was intended for my fork (vsbpdev/deepwiki-open) but was accidentally created in the upstream repository. REQUEST: Please close this PR as it was created in error. ROOT CAUSE: GitHub CLI defaulted to upstream repository instead of fork I sincerely apologize for the confusion and any inconvenience this may have caused. This type of error will not happen again. Thank you for your understanding. |
Summary
This PR fixes the critical timeout issue where wiki structure determination was prematurely timing out after 5 minutes when complex repositories required 20+ minutes of processing time. The issue was caused by hardcoded timeout caps in the frontend that overrode the backend's dynamic timeout calculations.
Key Changes Made
🔧 Frontend Timeout Logic Updates (
src/app/[owner]/[repo]/page.tsx
)🌍 Environment Variable Support
NEXT_PUBLIC_MAX_PROCESSING_TIMEOUT
- Maximum global processing timeout (default: 2 hours)NEXT_PUBLIC_MAX_PAGE_TIMEOUT
- Maximum per-page timeout (default: 15 minutes)NEXT_PUBLIC_DEFAULT_TIMEOUT
- Default when complexity analysis fails (default: 10 minutes)NEXT_PUBLIC_TIMEOUT_XLARGE
- Extra-large repository threshold (default: 30 minutes)📋 Documentation
TIMEOUT_FIX_DOCUMENTATION.md
) explaining the fixProblem Solved
Before: Backend calculates 20-minute timeout → Frontend caps at 5 minutes → Premature timeout error
After: Backend calculates 20-minute timeout → Frontend respects recommendation → Successful processing
Key Benefits
✅ Eliminates premature timeouts for complex repositories requiring extended processing
✅ Maintains performance for simple repositories with appropriate shorter timeouts
✅ Provides configurability through environment variables for different deployment needs
✅ Ensures safety with reasonable minimum and maximum bounds
✅ Backward compatible - works without any environment variables using sensible defaults
✅ Better debugging with timeout adjustment logging
Testing Verification
Technical Details
Timeout Flow
api/data_pipeline.py
analyzes repository complexity and recommends timeoutsConfiguration Hierarchy
Breaking Changes
None - this is a backward compatible fix that maintains existing behavior while enabling longer timeouts when needed.
Related Issues
Resolves timeout error: "Wiki structure determination timed out after 5 minutes. Please try again with a smaller repository or enable file filtering."