This project is an AI-powered Insurance Claims Assistant that streamlines claim handling through an automated agentic workflow. The system processes accident images uploaded by customers, generates AI-based damage descriptions, and triggers a backend agent that retrieves relevant insurance policies using vector search and produces actionable claim recommendations for handlers.
The application follows a comprehensive agentic workflow that bridges structured and unstructured data:
- Customer Input: Customers upload damage photos through a web interface
- AI Image Analysis: Advanced AI models analyze and describe accident damage in real-time
- Intelligent Agent Processing: A LangGraph-powered agent processes the description through multiple decision points (a minimal sketch follows this list)
- Vector-Based Policy Retrieval: Agent uses semantic search to find relevant insurance guidelines
- Automated Claim Processing: Agent creates comprehensive claim summaries with structured recommendations
- Handler Assignment: System assigns claims to appropriate handlers with complete documentation
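To make the agent step concrete, here is a minimal, illustrative LangGraph sketch of such a workflow. The node names, state fields, and placeholder logic are assumptions for illustration, not the project's actual implementation.

```python
# Minimal LangGraph sketch of the claim workflow (illustrative; names are assumed)
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class ClaimState(TypedDict):
    damage_description: str       # produced by the image-analysis step
    relevant_policies: list[str]  # filled in by vector search
    claim_summary: str            # final output handed to the claim handler

def retrieve_policies(state: ClaimState) -> dict:
    # Placeholder: the real agent runs Atlas Vector Search over policy_documents
    return {"relevant_policies": ["collision-coverage-guideline"]}

def summarize_claim(state: ClaimState) -> dict:
    # Placeholder: the real agent calls Claude 3 Haiku on AWS Bedrock
    return {"claim_summary": f"Recommend review of: {state['damage_description']}"}

graph = StateGraph(ClaimState)
graph.add_node("retrieve_policies", retrieve_policies)
graph.add_node("summarize_claim", summarize_claim)
graph.add_edge(START, "retrieve_policies")
graph.add_edge("retrieve_policies", "summarize_claim")
graph.add_edge("summarize_claim", END)
app = graph.compile()

result = app.invoke({"damage_description": "Rear bumper dented in a parking-lot collision"})
print(result["claim_summary"])
```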
This project uses MongoDB Atlas Vector Search for fast, relevant retrieval of policy information, with MongoDB Atlas serving as a scalable database for the rest of the claims workflow, including large data volumes and complex queries. MongoDB brings several advantages to this use case:
- Unified Data Platform: Handles both structured claim data (customer info, policy details) and unstructured data (damage photos, policy documents, accident reports) in a single database, with no data silos or complex ETL processes.
- Atlas Vector Search: Powers semantic similarity search over embeddings to find relevant insurance policies based on accident descriptions, enabling intelligent claim routing and accurate policy recommendations.
- Flexible Schema Evolution: Suits evolving agentic workflows where claim structures, agent tools, and processing steps continuously adapt to new regulations, products, and customer needs without database migrations.
- Multi-modal Data Storage: MongoDB efficiently stores and indexes diverse data types, including structured claim records, document-like policies, unstructured accident images, vector embeddings, and even time-series sensor data, so all of the data used for AI/ML processing and real-time analytics lives in a single, unified platform (an illustrative document shape follows this list).
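As a rough illustration, here is how a policy guideline and a processed claim might look as MongoDB documents. Apart from descriptionEmbedding, which the vector index below relies on, the field names are assumptions rather than the project's actual schema.

```python
# Illustrative document shapes only; field names other than descriptionEmbedding are assumed
policy_document = {
    "title": "Collision coverage guideline",
    "description": "Covers repair of body damage from collisions with vehicles or stationary objects.",
    "descriptionEmbedding": [0.012, -0.084, 0.153],  # truncated; real Cohere vectors have 1024 dimensions
}

processed_claim = {
    "claim_id": "CLM-2024-0042",                       # structured claim data
    "customer": {"name": "Jane Doe", "policy_number": "POL-998877"},
    "damage_description": "Rear bumper dented, left tail light cracked",  # AI-generated from the photo
    "image_key": "uploads/clm-2024-0042/photo-1.jpg",  # pointer to the unstructured image
    "recommendation": "Approve repair under collision coverage",
    "status": "assigned_to_handler",
}
```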
- Guideline Retrieval: Vector-based search for relevant insurance guidelines based on accident descriptions
- Persistent State Management: Store claim data, chat history, and agent states in MongoDB with full audit trails
- Vector-Powered Policy Retrieval: Semantic search through insurance guidelines using Cohere embeddings with cosine similarity (see the query sketch after this list)
- Flexible Data Storage: MongoDB's document structure handles dynamic claim data and evolving workflows
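The snippet below sketches that retrieval step: it embeds an accident description with Cohere Embed v3 on Bedrock and runs an Atlas `$vectorSearch` aggregation against the policy collection. The index name vector_index and the projected description field are assumptions; adjust them to your own index and schema.

```python
# Sketch of vector-based policy retrieval (index name and projected fields are assumed)
import json
import boto3
from pymongo import MongoClient

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
mongo = MongoClient("mongodb+srv://<username>:<password>@<cluster-name>.mongodb.net/")
policies = mongo["insurance_claims"]["policy_documents"]

def embed(text: str) -> list[float]:
    # Cohere Embed English v3 on Bedrock returns 1024-dimensional vectors
    body = json.dumps({"texts": [text], "input_type": "search_query"})
    response = bedrock.invoke_model(modelId="cohere.embed-english-v3", body=body)
    return json.loads(response["body"].read())["embeddings"][0]

query_vector = embed("Front-end collision with broken headlight and crumpled hood")
results = policies.aggregate([
    {
        "$vectorSearch": {
            "index": "vector_index",            # assumed index name
            "path": "descriptionEmbedding",
            "queryVector": query_vector,
            "numCandidates": 100,
            "limit": 3,
        }
    },
    {"$project": {"_id": 0, "description": 1, "score": {"$meta": "vectorSearchScore"}}},
])
for doc in results:
    print(doc)
```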
- FastAPI: Modern, high-performance web framework for building APIs
- Uvicorn: ASGI web server for running FastAPI applications
- Python 3.10: Core programming language (>=3.10,<3.11)
- LangChain: Framework for developing applications with language models
- LangGraph: Library for building stateful, multi-actor agentic applications
- AWS Bedrock: Managed service for foundation models
- Claude 3 Haiku: anthropic.claude-3-haiku-20240307-v1:0 – Fast agent orchestration and reasoning
- Claude 3 Sonnet: anthropic.claude-3-sonnet-20240229-v1:0 – Advanced multi-modal image analysis
- Cohere English V3: cohere.embed-english-v3 – Text embeddings for vector search
- MongoDB Atlas: Cloud-native document database with vector search capabilities
- MongoDB Atlas Vector Search: Semantic similarity search for policy retrieval
- PyMongo: Python driver for MongoDB operations
- LangGraph MongoDB Checkpoint: Agent state persistence and workflow tracking
- Next.js: React framework with server-side rendering
- React: Frontend JavaScript library
- CSS Modules: Scoped component styling
- Docker: Container platform for consistent deployments
- Docker Compose: Multi-container application orchestration
- Poetry: Python dependency management and packaging
- Make: Build automation and deployment commands
- Log in to MongoDB Atlas and create a new database named insurance_claims
- Create the following collections:
  - processed_claims – For storing final claim summaries
  - chat_history – For agent conversation persistence
  - policy_documents – For insurance guidelines and policies (with vector embeddings)
- Set up a MongoDB Vector Search index on the policy_documents collection (a PyMongo sketch of these setup steps follows the index definition):
{
"fields": [
{
"type": "vector",
"path": "descriptionEmbedding",
"numDimensions": 1024,
"similarity": "cosine"
}
]
}
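If you prefer to script this setup, the sketch below creates the collections and the vector index with PyMongo (requires PyMongo 4.7+ and an Atlas cluster). The index name vector_index is an assumption and must match whatever name your application queries.

```python
# Sketch: create the collections and the Vector Search index programmatically
from pymongo import MongoClient
from pymongo.operations import SearchIndexModel

client = MongoClient("mongodb+srv://<username>:<password>@<cluster-name>.mongodb.net/")
db = client["insurance_claims"]

# Create the three collections if they do not exist yet
for name in ("processed_claims", "chat_history", "policy_documents"):
    if name not in db.list_collection_names():
        db.create_collection(name)

# Vector Search index matching the JSON definition above ("vector_index" is an assumed name)
index_model = SearchIndexModel(
    definition={
        "fields": [
            {
                "type": "vector",
                "path": "descriptionEmbedding",
                "numDimensions": 1024,
                "similarity": "cosine",
            }
        ]
    },
    name="vector_index",
    type="vectorSearch",
)
db["policy_documents"].create_search_index(model=index_model)
```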
- Create an AWS account if you don't have one
- Add the AWS Access Key ID and Secret Access Key to your environment variables
- Grant the necessary permissions to the AWS account: AmazonBedrockFullAccess
- Ensure the required Bedrock models are available in your region (a quick availability check follows this list):
  - anthropic.claude-3-haiku-20240307-v1:0 (for agent orchestration)
  - anthropic.claude-3-sonnet-20240229-v1:0 (for image analysis)
  - cohere.embed-english-v3 (for text embeddings)
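A short boto3 script can confirm the models are offered in your region (model access grants themselves are managed in the Bedrock console). This is a sketch; the region us-east-1 is assumed to match the configuration below.

```python
# Sketch: verify the three Bedrock models are offered in your region
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")
available = {m["modelId"] for m in bedrock.list_foundation_models()["modelSummaries"]}

for model_id in (
    "anthropic.claude-3-haiku-20240307-v1:0",
    "anthropic.claude-3-sonnet-20240229-v1:0",
    "cohere.embed-english-v3",
):
    status = "listed" if model_id in available else "NOT listed - check region or request access"
    print(f"{model_id}: {status}")
```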
- Docker Desktop installed and running on your machine
- make installed (sudo apt install make on Ubuntu or brew install make on macOS)
Clone the repository:
git clone <repo-url>
cd insurance-claim-agent
Create a .env file in the root directory:
# AWS Configuration
AWS_ACCESS_KEY_ID=your_aws_access_key
AWS_SECRET_ACCESS_KEY=your_aws_secret_key
AWS_DEFAULT_REGION=us-east-1
# MongoDB Configuration
MONGODB_URI=mongodb+srv://<username>:<password>@<cluster-name>.mongodb.net/
DATABASE_NAME=insurance_claims
COLLECTION_NAME=policy_documents
COLLECTION_NAME_2=processed_claims
CHAT_HISTORY_COLLECTION=chat_history
# Bedrock Configuration
BEDROCK_REGION=us-east-1
# Frontend Configuration
NEXT_PUBLIC_IMAGE_DESCRIPTOR_API_URL=http://localhost:8000/imageDescriptor
NEXT_PUBLIC_RUN_AGENT_API_URL=http://localhost:8000/runAgent
Build and start all services:
make build
- Frontend UI: http://localhost:3000
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs
Start services (if already built):
make start
Stop all services:
make stop
View logs:
docker-compose logs -f
Clean up containers and images:
make clean
Clone the repository:
git clone <repo-url>
cd insurance-claim-agent/backend
Install Poetry (if not already installed):
make install_poetry
Configure Poetry and install dependencies:
poetry install
Create a .env file in the backend directory:
# AWS Configuration
AWS_ACCESS_KEY_ID=your_aws_access_key
AWS_SECRET_ACCESS_KEY=your_aws_secret_key
AWS_DEFAULT_REGION=us-east-1
# MongoDB Configuration
MONGODB_URI=mongodb+srv://<username>:<password>@<cluster-name>.mongodb.net/
DATABASE_NAME=insurance_claims
COLLECTION_NAME=policy_documents
COLLECTION_NAME_2=processed_claims
CHAT_HISTORY_COLLECTION=chat_history
# Bedrock Configuration
BEDROCK_REGION=us-east-1
Start the backend server.
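The exact command depends on the backend's entry point; a typical FastAPI/Uvicorn invocation (the module path main:app is an assumption) would be:
poetry run uvicorn main:app --host 0.0.0.0 --port 8000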
Open a new terminal and navigate to the frontend directory:
cd ../frontend # or 'cd frontend' if starting from project root
Create a .env.local file in the frontend directory:
NEXT_PUBLIC_IMAGE_DESCRIPTOR_API_URL=http://localhost:8000/imageDescriptor
NEXT_PUBLIC_RUN_AGENT_API_URL=http://localhost:8000/runAgent
Install dependencies:
npm install
Start the frontend development server:
npm run dev
- Frontend UI: http://localhost:3000
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs
If you get package installation errors, use:
poetry install --no-root
The provided Makefile streamlines common setup and development tasks. If you encounter issues with setting up Poetry, Docker containers, or running the application, try using the provided make commands, such as:
- make install_poetry – Installs Poetry for dependency management
- make build – Builds Docker images for backend and frontend
- make start / make stop – Starts or stops the full stack using Docker Compose
- make clean – Cleans up containers and images to resolve potential environment conflicts
Refer to the Makefile itself or run make help for a full list and description of available commands.
- Ensure Docker Desktop is running before using Make commands
- Check that AWS credentials are properly mounted in containers
- Verify that ports 3000 and 8000 are not in use by other applications
- Ensure your AWS credentials have Bedrock permissions for all three models
- Verify Claude and Cohere models are available in your specified region
- Check that your AWS account has been granted access to required models
- Verify your MongoDB URI format and network connectivity (a quick connectivity check follows this list)
- Ensure all required collections (processed_claims, chat_history, policy_documents) exist
- Check that your Vector Search index is properly configured with the correct field names
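A minimal connectivity check, assuming MONGODB_URI is set in your environment:

```python
# Sketch: ping the cluster and list collections to confirm connectivity and setup
import os
from pymongo import MongoClient
from pymongo.errors import ConnectionFailure

client = MongoClient(os.environ["MONGODB_URI"], serverSelectionTimeoutMS=5000)
try:
    client.admin.command("ping")
    print("Connected. Collections:", client["insurance_claims"].list_collection_names())
except ConnectionFailure as exc:
    print("Connection failed:", exc)
```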
- Ensure .env files are in the correct directories
- Verify all required environment variables are set
- Check that sensitive values are properly configured (no placeholder text)