The Intel Support Lens Dashboard is a comprehensive knowledge base solution designed for internal support teams. It combines semantic search technology with advanced LLM capabilities to provide instant, accurate answers to support questions based on your organization's documentation.
This application helps support agents quickly find information and respond to customer inquiries without having to manually search through extensive documentation. The system keeps track of documents, provides analytics on usage patterns, and continuously improves through interaction data.
- AI-powered Question Answering: Get contextually relevant answers based on support documentation
- Semantic Document Search: Find documents by meaning, not just keywords
- Document Management: Upload and manage support documentation
- Analytics Dashboard: Track system usage and performance metrics
- Citation Tracking: All responses include source citations for verification
- Multi-format Support: Handle Markdown, CSV, and plain text documents
The application is built with a modern stack:
- Backend: FastAPI + PostgreSQL (with pgvector) + LlamaIndex + Google Gemini
- Frontend: React + TypeScript + TailwindCSS + shadcn/ui
- Python 3.9+
- Node.js 18+
- PostgreSQL with pgvector extension
- Google API key for Gemini
- MongoDB instance (used by the `MONGO_URI` setting in the backend configuration)
- Clone this repository
```bash
git clone https://github.com/gurveervirk/intel-support-lens-dashboard.git
cd intel-support-lens-dashboard
```
- Set up backend environment
```bash
cd backend
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```
- Create a `.env` file in the backend directory with the following variables:
```env
GOOGLE_API_KEY=your_google_api_key
CONNECTION_STRING=postgresql://username:password@localhost:5432
DB_NAME=intel_support_lens
TEMP_DIR=./tmp
MONGO_URI=mongodb://localhost:27017/  # modify for your MongoDB instance
```
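The backend presumably reads these variables at startup; a minimal sketch of that pattern using `os.getenv`, assuming these exact variable names (the fallback values below are illustrative only, not the application's actual defaults):

```python
import os

def load_settings() -> dict:
    """Read backend configuration from environment variables.

    The names mirror the .env file above; the fallbacks are
    illustrative, not the application's real defaults.
    """
    settings = {
        "google_api_key": os.getenv("GOOGLE_API_KEY"),
        "connection_string": os.getenv(
            "CONNECTION_STRING", "postgresql://localhost:5432"
        ),
        "db_name": os.getenv("DB_NAME", "intel_support_lens"),
        "temp_dir": os.getenv("TEMP_DIR", "./tmp"),
        "mongo_uri": os.getenv("MONGO_URI", "mongodb://localhost:27017/"),
    }
    # Fail fast on anything that has no usable value at all.
    missing = [key for key, value in settings.items() if value is None]
    if missing:
        raise RuntimeError(f"Missing required settings: {missing}")
    return settings
```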
- Set up the frontend environment
```bash
cd ../frontend
npm install
```
- Start the backend server
```bash
cd backend
python -m uvicorn app:app --reload
```
- In a separate terminal, start the frontend development server
```bash
cd frontend
npm run dev
```
- Access the application at http://localhost:8080
- Use the document upload feature to add your initial support documentation
- The system will process and index the documents automatically
- Start using the chat interface to query your knowledge base
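Under the hood, the chat interface calls the backend API. The real route and request schema live in `backend/app.py`; the helper below only sketches what such a request might look like, so the endpoint path (`/api/query`) and field names (`question`, `top_k`) are assumptions, not the documented contract:

```python
import json
import urllib.request

def build_query_request(question: str, top_k: int = 3) -> urllib.request.Request:
    """Build a POST request for a hypothetical /api/query endpoint.

    The URL, path, and field names are assumptions for illustration;
    check backend/app.py for the actual route and schema.
    """
    body = json.dumps({"question": question, "top_k": top_k}).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:8000/api/query",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen(build_query_request(...))` would return the answer payload once the backend is running.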
For production deployment:
- Use a production ASGI server setup (FastAPI is ASGI, not WSGI), e.g. Uvicorn workers managed by Gunicorn
- Set up proper database connection pooling
- Configure CORS settings appropriately
- Implement authentication and authorization
- Consider containerizing the application with Docker
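For the CORS point, FastAPI ships a `CORSMiddleware`; a minimal sketch that restricts origins to the frontend host rather than allowing `*` (the origin list here matches the local dev setup and is an assumption — replace it with your deployed frontend URL):

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Allow only the known frontend origin instead of "*" in production.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:8080"],
    allow_credentials=True,
    allow_methods=["GET", "POST"],
    allow_headers=["*"],
)
```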
The application has been load tested with Locust and can handle multiple simultaneous users with reasonable response times. For large document collections, consider scaling the database and optimizing vector search parameters.
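A minimal locustfile for repeating such a load test might look like the sketch below; the endpoint path and payload are assumptions for illustration, so point the task at the backend's real query route before running `locust -f locustfile.py`:

```python
from locust import HttpUser, task, between

class SupportAgent(HttpUser):
    # Simulate an agent pausing between questions.
    wait_time = between(1, 5)

    @task
    def ask_question(self):
        # Hypothetical endpoint and payload; see backend/app.py
        # for the actual query route and schema.
        self.client.post(
            "/api/query",
            json={"question": "How do I reset a user password?"},
        )
```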
- Integration with ticketing systems
- Support for more document formats
- Advanced analytics and reporting
- User feedback collection for response quality
- Fine-tuning capabilities for domain-specific terminology