| Branch | Status |
|---|---|
| Main | |
| Backend-dev | |
- 1. Introduction
- 2. System Description
- 3. UML Diagrams
- 4. Backlog
- 5. Branch Name Template
- 6. Commit Template
- 7. Getting Started
- 8. API Reference (OpenAPI)
- Our customers are HR personnel.
- Our users are candidates who apply for jobs.
- Filter resumes to identify those that meet the job requirements or other criteria.
- Conduct simple automated conversations with candidates and score them.
- Show the scoreboard to HR personnel.
- We simulate an IT company's career website.
- Candidates upload their resumes to the website.
- The backend uses an LLM to filter resumes against the requirements of the target job (see the sketch below).
- After filtering, the AI asks candidates for basic information about their resumes or related technologies (this may involve combining tech documents from a vector database). For example, it might ask questions about their understanding of the technologies listed in the job requirements or discuss projects/experiences detailed in their resume.
- The AI scores the candidates.
- HR personnel can see the list of scores on a console page.
- AI filters resumes based on job requirements.
- AI scores candidates based on their resumes and chat history.
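As a rough illustration of the resume-filtering step described above, the sketch below calls the OpenAI API directly. It assumes the OpenAI Python SDK (>= 1.0) and an `OPENAI_API_KEY` in the environment; the model name, prompt, and `filter_resume` helper are placeholders, not the project's actual implementation (which lives in the LangChain GenAI Service).

```python
# Minimal sketch of LLM-based resume filtering (illustrative only).
# Assumes the OpenAI Python SDK >= 1.0 and OPENAI_API_KEY in the environment;
# the model and prompt are placeholders (the project's model is TBD).
from openai import OpenAI

client = OpenAI()

def filter_resume(job_requirements: str, resume_text: str) -> str:
    """Ask the LLM whether a resume satisfies the job requirements."""
    prompt = (
        "You are screening resumes for an IT position.\n"
        f"Job requirements:\n{job_requirements}\n\n"
        f"Resume:\n{resume_text}\n\n"
        "Answer QUALIFIED or NOT QUALIFIED and give a one-sentence reason."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; the project has not fixed a model yet
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Example call with toy data:
# print(filter_resume("3+ years of Java, Spring Boot", "5 years backend Java ..."))
```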
This system is designed with a layered architecture, consisting of the following layers:
- UI Layer
- Tech: React
- Components: Candidate Portal, HR Dashboard
- Application Services Layer
- Tech: Spring Boot
- Components: API Gateway, Job Management Service, Application Management Service, LangChain GenAI Service
- Data Storage Layer
- Tech: PostgreSQL + pgvector
- Details: PostgreSQL serves as the relational database for structured data, while pgvector provides vector database capabilities for RAG.
- External Services Layer
- Details: OpenAI API for LLM capabilities (specific model TBD).
The backend is implemented using Spring Boot, exposing RESTful APIs to support both candidate and HR operations. It handles business logic, including:
- Job posting and management (via `Job Management Service`).
- Candidate application processing, resume filtering, interview generation, and scoring (via `Candidate Application Service`).
- Integration with external services, such as OpenAI, for GenAI tasks (via `LangChain GenAI Service`).
The `API Gateway` serves as the single entry point, routing requests from UI clients to the appropriate services.
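For illustration, a client might talk to the gateway roughly as follows. The base URL, endpoint paths, and payload fields are assumptions, not the project's actual API; consult the OpenAPI reference in section 8 for the real contract.

```python
# Hypothetical client calls against the API Gateway (paths and fields are
# illustrative; see the OpenAPI spec for the actual endpoints).
import requests

BASE_URL = "http://localhost:8080/api"  # assumed gateway address for local dev

# An HR user posts a new job via the Job Management Service.
job = requests.post(f"{BASE_URL}/jobs", json={
    "title": "Backend Engineer",
    "requirements": "Java, Spring Boot, PostgreSQL",
}).json()

# A candidate uploads a resume for that job via the application service.
with open("resume.pdf", "rb") as f:
    requests.post(
        f"{BASE_URL}/jobs/{job['id']}/applications",
        files={"resume": f},
    )
```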
The system has two main user interfaces developed in React:
- Candidate Portal: Allows candidates to upload resumes and complete AI-based interviews.
- HR Dashboard: Enables HR users to post jobs, view filtered resumes, see generated interview questions, and review ranked candidate lists.
All frontend components communicate with the backend via REST APIs.
A separate microservice is developed in Python, using LangChain to orchestrate LLM tasks. This `LangChain GenAI Service` is responsible for:
- Filtering resumes against job requirements and scoring them.
- Generating interview questions based on job requirements, candidate resumes, and documents in the vector database.
- Analyzing chat history and scoring candidates against job requirements.
It communicates with the main Spring Boot backend (Application Services Layer) and uses the OpenAI API for underlying LLM capabilities.
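A minimal sketch of how this service could orchestrate interview-question generation with LangChain is shown below. The package names (`langchain-openai`, `langchain-core`) and the model are assumptions, and retrieval of supporting documents from pgvector is stubbed out and passed in as plain text.

```python
# Illustrative LangChain pipeline for interview-question generation.
# Assumes the langchain-openai / langchain-core packages and an OpenAI key;
# the model name is a placeholder and pgvector retrieval is stubbed out.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model

prompt = ChatPromptTemplate.from_template(
    "Job requirements:\n{requirements}\n\n"
    "Candidate resume:\n{resume}\n\n"
    "Relevant technical documentation:\n{context}\n\n"
    "Generate three interview questions that probe the candidate's "
    "understanding of the listed technologies and their past projects."
)

chain = prompt | llm

questions = chain.invoke({
    "requirements": "Java, Spring Boot, PostgreSQL",
    "resume": "5 years of backend development ...",
    "context": "(documents retrieved from the pgvector store would go here)",
})
print(questions.content)
```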
The system utilizes a PostgreSQL database extended with the pgvector extension to support semantic search and vector-based operations. It stores:
- User data (candidates, HR personnel)
- Job postings and requirements
- Candidate resumes and applications
- Assessment scores and chat histories
- Vector embeddings for technical documents (for RAG)
This design allows the system to store structured HR data alongside high-dimensional AI data used for generating questions, scoring, and analysis.
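As a sketch of the vector side of this design, the snippet below creates an embedding table and runs a similarity query with pgvector via the psycopg driver. The connection string, table, and column names are illustrative, not the project's actual schema.

```python
# Illustrative pgvector usage: store document embeddings and run a
# similarity search for RAG. Table and column names are assumptions.
import psycopg

with psycopg.connect("postgresql://postgres:postgres@localhost:5432/hr") as conn:
    conn.execute("CREATE EXTENSION IF NOT EXISTS vector")
    conn.execute(
        """
        CREATE TABLE IF NOT EXISTS tech_documents (
            id bigserial PRIMARY KEY,
            content text,
            embedding vector(1536)  -- dimension depends on the embedding model
        )
        """
    )

    # In practice the query vector comes from the same embedding model used
    # at indexing time; a zero vector is used here only as a placeholder.
    query_embedding = [0.0] * 1536
    vector_literal = "[" + ",".join(str(x) for x in query_embedding) + "]"
    rows = conn.execute(
        "SELECT content FROM tech_documents "
        "ORDER BY embedding <-> %s::vector LIMIT 5",
        (vector_literal,),
    ).fetchall()
    for (content,) in rows:
        print(content[:80])
```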
- As an HR user, I want to add a new job requirement so that candidates can apply for it.
- As an HR user, I want to close a job requirement so that candidates can no longer apply for it.
- As a candidate, I want to upload my resume so that I can apply for a job.
- As an HR user, I want the AI to filter resumes so that I can find qualified candidates.
- As an HR user, I want the AI to generate questions to ask candidates so that I can better understand their qualifications.
- As a candidate, I want to answer the AI's questions about my resume or projects so that I can better present my strengths.
- As an HR user, I want the AI to score candidates based on their resumes and chat history so that I can find the best candidates.
- As an HR user, I want to view a ranked list of candidate scores so that I can quickly identify the best candidates.
week<digit>-<task>
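For example, a week-3 task adding resume upload might use the branch name `week3-resume-upload` (an illustrative name, not an actual branch in this repository).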
<type>[optional scope]: <description>
[optional body]
[optional footer(s)]
For example:
git commit -m "feat: add a new feature" \
-m "add a new feature to the project" \
-m "closes: #1234"
Commit Types:
- feat: A new feature
- fix: A bug fix
- docs: Documentation only changes
- ci: Changes to CI configuration files and scripts
- test: Adding missing tests or correcting existing tests
- refactor: Refactoring code without changing business logic (e.g., changing variable names, structures, code style)
- perf: Optimizing performance by improving code logic
The following subsections outline the key environment files and automation scripts used during local development, image build, and deployment.
Copy `example.env` to `.env` and customise environment variables such as database credentials, JWT keys, and `VITE_API_BASE_URL`. All services load `.env` automatically.
cp example.env .env
Start PostgreSQL + pgvector only – ideal for frontend/backend development when the other microservices are not required.
docker compose -f docker-compose-db.yml up -d
Launch all backend + frontend services for a complete local stack.
docker compose up --build -d
Build production-grade images only (does not run them). Mainly used in CI pipelines.
docker compose -f docker-compose.prod.yml build
Run the images built in step 7.4. Useful for staging or on-premise deployments.
docker compose -f docker-compose.prod.deploy.yml up -d
Infrastructure-as-Code – provision cloud resources (VPC, RDS, EKS, etc.) with Terraform.
./terraform/terraform.sh apply
Use Ansible to install dependencies, distribute configuration, and deploy services to the provisioned hosts.
./ansible/ansible.sh
One-click installation of a lightweight Kubernetes cluster (k3s / micro-k8s) for testing. Contains commands for both development and production environments.
./k8s-install.sh
All scripts are executable. If you encounter permission issues, run `chmod +x <script>`.
An offline copy of the API specification lives in `api-openapi-firefox-online.md`. The file was generated automatically by Apifox after the project was completed and can be fed to an LLM to help it understand the entire API surface.
- Online interactive documentation / playground: https://ifoh7semfe.apifox.cn/
- The spec follows OpenAPI 3.0 and can be imported into Postman, Stoplight, Swagger-UI, etc.