# Multi-Agent Job Application AI Assistant Built with MCP and OpenAPI
This project is an agentic workflow orchestration system built with the Model Context Protocol (MCP) on FastAPI. It automates job application processing using ChatGPT for job description summarization and OpenAI embeddings for semantic similarity scoring. It also manages email notifications with the Resend API and stores data and error logs in SQLite.
- **Resume Extraction**: Extracts text from PDF and DOCX files using PyMuPDF and python-docx
- **Job Description Summarization**: Generates concise summaries using ChatGPT (`gpt-3.5-turbo`)
- **Semantic Scoring**: Computes cosine similarity using OpenAI embeddings (`text-embedding-ada-002`) and numpy
- **Email Automation**: Sends interview invitations via the Resend API for candidates above the score threshold
- **Duplicate Detection**: Prevents redundant processing by checking existing applications by email, resume, and job description
- **Resume Validation**: Validates documents as resumes using ChatGPT-based analysis
- **Error Logging**: Records errors in a SQLite database for auditing
- **Custom MCP Server**: Exposes tools as reusable endpoints under `/mcp`
- **Responsive Web UI**: Modern Bootstrap interface for easy form submission
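The semantic-scoring step reduces to a cosine similarity between two embedding vectors. A minimal numpy sketch (the function name is illustrative; the project obtains the vectors from `text-embedding-ada-002`):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Parallel vectors score 1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 2.0], [2.0, 4.0]))  # ~1.0 (within float precision)
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

Candidates whose resume embedding scores above the configured threshold against the job-description embedding trigger the email-automation step.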
- FastAPI serves as the HTTP server
- MCP extends FastAPI with agentic tools under `/mcp`
- The database layer uses SQLAlchemy with SQLite for storage
- File handling saves uploaded resumes in an `uploads` directory
| Component | Technology | Purpose |
|---|---|---|
| Framework | FastAPI | API server |
| MCP | fastapi_mcp (stub) | Agent orchestration |
| Database | SQLAlchemy, SQLite | Persistent storage |
| NLP | openai | ChatGPT summarization and embeddings |
| Email | resend | Notification delivery |
| File Processing | PyMuPDF, python-docx | Resume text extraction |
| Utilities | numpy | Numerical operations for similarity scoring |
- fastapi
- uvicorn
- python-dotenv
- openai >= 1.0
- resend
- sqlalchemy
- PyMuPDF
- python-docx
- python-multipart
- numpy
- fastapi_mcp (included)
- Obtain API keys for the external services.
- Copy `.env.example` to `.env` and replace the placeholder values with your own keys. The `.env` file is ignored by Git so your keys remain private.
- Create and activate a virtual environment (recommended):

  ```bash
  python -m venv venv
  source venv/bin/activate
  ```

- Install the required Python packages using your environment's Python:

  ```bash
  python -m pip install -r requirement.txt
  ```

  You can also run `./setup.sh`, which installs the same dependencies. If you see `ModuleNotFoundError` errors when running the app, rerun the above command inside the same virtual environment to ensure all dependencies are installed.
- Ensure the `uploads` directory is writable.
- Run `python main.py` to start the API server (listens on `http://127.0.0.1:8000`).
- Open `http://localhost:8000` in your browser to access the Bootstrap-based web UI and API.
Processes a job application by analyzing the resume and job description.

- Parameters: `file` (PDF or DOCX), `job_description_text`
- Response: JSON with `email`, `score`, `email_status`, `message`
Tools are available under `/mcp` for individual testing:
- `extract_resume_text`
- `summarize_job_description`
- `score_similarity`
- `extract_email_address`
- `send_email_notification`
- `invite_for_interview`
- `find_existing_application`
- `validate_resume_document`
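As an illustration of how a tool like `extract_email_address` might work, here is a simple regex scan. This is a sketch, not the project's actual implementation, which may use ChatGPT or stricter validation:

```python
import re

# Pragmatic pattern for common addresses; not a full RFC 5322 validator.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_email_address(resume_text: str):
    """Return the first email-like string in the resume text, or None."""
    match = EMAIL_RE.search(resume_text)
    return match.group(0) if match else None

print(extract_email_address("Jane Doe | jane.doe@example.com | 555-0100"))
# jane.doe@example.com
```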
- `applications`: id, email, resume_text, job_description, score, email_status, created_at
- `error_logs`: id, error_message, created_at
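The two tables map to SQLite DDL along these lines. This is an equivalent sketch using the stdlib `sqlite3`; the project itself defines the tables through SQLAlchemy models, so exact column types may differ:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE applications (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    email TEXT,
    resume_text TEXT,
    job_description TEXT,
    score REAL,
    email_status TEXT,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE error_logs (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    error_message TEXT,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
""")

# Insert and read back a row to confirm the schema works.
conn.execute(
    "INSERT INTO applications (email, score, email_status) VALUES (?, ?, ?)",
    ("jane@example.com", 0.87, "sent"),
)
row = conn.execute("SELECT email, score FROM applications").fetchone()
print(row)  # ('jane@example.com', 0.87)
```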
- Use `OPENAI_API_KEY` for OpenAI access.
- Customize the `OPENAI_MODEL` environment variable to switch the chat model. The sample `.env.example` uses `gpt-4.0` as the default model.
- The `fastapi_mcp` module is bundled with the repository as a lightweight stub, so there is no external `fastapi-mcp` package to install.
- This project requires the `openai` Python library version 1.0 or newer. If your environment has an older version installed, upgrade with `pip install -U openai`.