Many of the startup companies I applied to didn't have a proper CV processing pipeline, which leaves candidates unaware of what happened to their application; sometimes the recruiter doesn't even have a proper way to query those applications, making the process tedious for both candidates and recruiters.
This project aims to solve that by providing a configurable CV processing pipeline for companies that don't have time to build their own from scratch. So if you're a recruiter, you can set this up on your infrastructure (preferably AWS) in seconds and start managing those 1000+ CVs flowing in like a boss.
Plus, this is open source and licensed under MIT, so you can build whatever customization you want on top of it and use it freely.
Initially this was developed as an internship assignment for the Metana Intern Software Engineer position. I then fell in love with the idea and asked them if I could develop it into a configurable open-source application that anyone can use.
- So, a huge thank you to Metana for allowing me to develop this into a FOSS product.
- Collects candidates' CVs from an intuitive frontend.
- Extracts relevant details (via RegEx).
- Appends data to a recruiter’s Google Sheet.
- Notifies the recruiter once processing is complete.
- Sends follow-up emails to candidates.
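The RegEx extraction step can be sketched roughly as follows; this is a minimal, hypothetical example (the patterns and field names are illustrative, not the project's actual ones):

```python
import re

# Illustrative patterns -- the real pipeline's RegExes may differ.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def extract_details(cv_text: str) -> dict:
    """Return the first email and phone number found in raw CV text."""
    email = EMAIL_RE.search(cv_text)
    phone = PHONE_RE.search(cv_text)
    return {
        "email": email.group(0) if email else None,
        "phone": phone.group(0) if phone else None,
    }
```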
- Send follow-up emails in candidates' convenient time zones, as defined by the recruiters.
- Add a way to track each CV's progress and email candidates when there is an update.
- Add AI-based CV detail extraction as an option for recruiters.
- Implement an MCP (Model Context Protocol) server to bridge the results spreadsheet with an AI chatbot, so recruiters can query candidates' CVs in natural language instead of manually searching through the spreadsheet.
- Frontend - React (bootstrapped with Vite)
- Backend - Python (Lambda function)
- Middleman - Python (Lambda function)
- Infrastructure - Amazon Web Services (AWS)
- Email Service - SendGrid, AWS SES
- Google Sheets API - to write results to Google Sheets
- CI/CD - AWS CodePipeline and CodeBuild
- Infra-management - Terraform
- VCS - Git and GitHub
- Testing - Pytest
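As a hedged sketch of the Google Sheets write, here is what appending one extracted-CV row could look like with the google-api-python-client; the spreadsheet ID, range, and row shape are placeholders, not the project's actual values:

```python
# Hypothetical helper: append one extracted-CV row to the first sheet.
# `sheets_service` is a Sheets API service object, e.g. from
# googleapiclient.discovery.build("sheets", "v4", credentials=creds).
def append_candidate_row(sheets_service, spreadsheet_id: str, row: list) -> dict:
    """Append `row` after the last non-empty row of Sheet1."""
    return (
        sheets_service.spreadsheets()
        .values()
        .append(
            spreadsheetId=spreadsheet_id,
            range="Sheet1!A1",          # placeholder range
            valueInputOption="RAW",     # store values as-is, no parsing
            body={"values": [row]},
        )
        .execute()
    )
```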
- AWS IAM user with the following permissions:
AmazonAPIGatewayAdministrator
AmazonS3FullAccess
AmazonSSMFullAccess
AWSCodeBuildAdminAccess
AWSCodePipeline_FullAccess
AWSLambda_FullAccess
CloudWatchLogsFullAccess
IAMFullAccess
CodeStarAccessCustom (refer to the custom policy below)
KMSAccessCustom (refer to the custom policy below)
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ConnectionsFullAccess",
      "Effect": "Allow",
      "Action": [
        "codeconnections:CreateConnection",
        "codeconnections:DeleteConnection",
        "codeconnections:UseConnection",
        "codeconnections:GetConnection",
        "codeconnections:ListConnections",
        "codeconnections:ListInstallationTargets",
        "codeconnections:GetInstallationUrl",
        "codeconnections:StartOAuthHandshake",
        "codeconnections:UpdateConnectionInstallation",
        "codeconnections:GetIndividualAccessToken",
        "codeconnections:TagResource",
        "codeconnections:ListTagsForResource",
        "codeconnections:UntagResource"
      ],
      "Resource": "*"
    }
  ]
}
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kms:Encrypt",
        "kms:Decrypt",
        "kms:GenerateDataKey*",
        "kms:DescribeKey",
        "kms:CreateKey",
        "kms:ListKeys",
        "kms:CreateAlias"
      ],
      "Resource": "*"
    }
  ]
}
- Fork, clone and cd into the repo:
git clone "your-fork's-git-url"
cd <your-fork's-name>
- Zip the backend and middleman code:
cd backend
mkdir package
uv pip install --target=package -r requirements.txt
cp -r lambda_function.py models/ utils/ package/
cd package
zip -r ../lambda_function.zip . && cd ../..
cd middleman
mkdir package
uv pip install --target=package -r requirements.txt
cp -r get_presigned_url.py models/ utils/ package/
cd package
zip -r ../lambda_get_presigned_url.zip . && cd ..
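The packaging commands above can also be scripted. Below is a hedged Python sketch that mirrors them for one Lambda directory; the directory layout, the `models/`/`utils/` subfolders, and the use of `uv` are assumptions taken from the steps above:

```python
import shutil
import subprocess
import zipfile
from pathlib import Path

def build_lambda_zip(src_dir: str, entry: str, zip_name: str) -> Path:
    """Package one Lambda dir: vendor deps into ./package, copy sources, zip."""
    src = Path(src_dir)
    pkg = src / "package"
    pkg.mkdir(exist_ok=True)
    # Vendor dependencies into the package dir (assumes uv is on PATH).
    req = src / "requirements.txt"
    if req.exists():
        subprocess.run(
            ["uv", "pip", "install", "--target", str(pkg), "-r", str(req)],
            check=True,
        )
    # Copy the handler and its support modules next to the dependencies.
    shutil.copy2(src / entry, pkg / entry)
    for sub in ("models", "utils"):
        if (src / sub).is_dir():
            shutil.copytree(src / sub, pkg / sub, dirs_exist_ok=True)
    # Zip the package contents with paths relative to the package root,
    # which is the layout Lambda expects for a zip deployment.
    out = src / zip_name
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in pkg.rglob("*"):
            zf.write(f, f.relative_to(pkg))
    return out
```

For example, `build_lambda_zip("backend", "lambda_function.py", "lambda_function.zip")` would produce the same artifact as the manual backend steps.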
- Go to the terraform dir and create the terraform.tfvars file as follows:
cd ..
cd terraform
# terraform.tfvars
aws_region = "preferred-region"
github_owner = "github-user-name-of-the-git-repo"
github_repo = "fork's-name"
github_token = "gh-PAT-token"
github_webhook_secret = "put-a-unique-password-here"
- Init terraform:
terraform init
- Deploy the infra:
terraform validate
terraform plan -out="tfplan"
terraform apply "tfplan"
- Go to the AWS CodePipeline console > Settings > Connections and verify the CodeStar connection.
- Push the code to your remote Git repo:
git push -u origin main
This is an open-source project licensed under the MIT License, and any contribution is warmly welcome.