The main technologies used are NodeJS, Redis, OpenSearch, and Docker.
This project consists of two parts:

- a program `/src/pdftovector` that indexes PDF files under `/data` to create context for the chatbot
- a program `/src/replytouser` that takes a question, recontextualizes it based on conversation history saved in Redis, and generates a response from the context
The project is designed to run both standalone and as a microservice within Lambda or Fargate, so it already accepts input parameters such as consent acceptance, session ID, and user ID.
Console output includes benchmarks to monitor the time the chatbot takes to respond.
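As an illustration, the kind of benchmark line described above could be produced with a pattern like the following; this is a sketch, not the project's actual code, and the log format is an assumption:

```javascript
// Sketch: timing a chatbot response and logging a benchmark line.
// Not the project's actual implementation; the log format is an assumption.
async function timedResponse(generate) {
  const start = performance.now(); // `performance` is a global in modern NodeJS

  const response = await generate(); // the real code would generate the chatbot reply here

  const elapsedMs = performance.now() - start;
  console.log(`[benchmark] response generated in ${elapsedMs.toFixed(0)} ms`);
  return { response, elapsedMs };
}

// Usage with a placeholder generator that simulates slow work:
timedResponse(() => new Promise((resolve) => setTimeout(() => resolve('ok'), 50)));
```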
In the root directory you'll find:

- `.nvmrc`: NodeJS version
- `eslint.config.js`: linter configuration
- `.prettierrc`: formatter configuration
- `.husky/`: contains git hooks
- `docker-compose.yaml`: containers for local development
- `package.json`: dependencies and development scripts
- `src/`: all project microservices
- `data/`: directory containing PDF files to be indexed
- Run `nvm use` or manually set NodeJS to the version contained in the `.nvmrc` file
- Install development dependencies in the root and, for each service, install layer dependencies with `npm install`
- Populate the `.env` file
- Open `src/index.js` and populate the `generateResponse` function event with a question
- Run everything with `npm run dev`
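To make the `generateResponse` step above concrete, the event might be populated along these lines; every field name except the question itself is an assumption, so check the function's actual signature in `src/index.js`:

```javascript
// Hypothetical event for a local run of generateResponse in src/index.js.
// Field names besides `question` (sessionId, userId, consentAccepted) are
// assumptions; adapt them to what the function actually expects.
const event = {
  question: 'What topics do the indexed PDFs cover?',
  sessionId: 'local-dev-session', // would key the conversation history in Redis
  userId: 'local-dev-user',
  consentAccepted: true,
};

console.log(JSON.stringify(event, null, 2));
```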
Redis and OpenSearch are available locally as containers; the docker-compose stack can be managed with `npm run docker:start` and `npm run docker:stop`.
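For orientation, a minimal `docker-compose.yaml` for this pair of services might look like the sketch below; the image tags, ports, and security setting are assumptions, and the file shipped in the repository is authoritative:

```yaml
# Illustrative sketch only; defer to the repository's docker-compose.yaml.
services:
  redis:
    image: redis:7                          # assumed tag
    ports:
      - "6379:6379"
  opensearch:
    image: opensearchproject/opensearch:2   # assumed tag
    environment:
      - discovery.type=single-node
      - DISABLE_SECURITY_PLUGIN=true        # acceptable for local development only
    ports:
      - "9200:9200"
```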
eslint and prettier are used to lint and format code in an opinionated manner. They can be run independently with `npm run lint` and `npm run format` from the root.
Additionally, through Git hooks and husky, formatting runs before every commit and linting before every push. During this phase only staged files are considered, via lint-staged.
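A typical lint-staged setup in `package.json` looks roughly like this; the globs and commands used in this project may differ:

```json
{
  "lint-staged": {
    "*.js": ["prettier --write", "eslint --fix"]
  }
}
```

With a configuration like this, each staged `.js` file is formatted and then linted, rather than running the tools over the whole repository.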
The included test runner is vitest, which is Jest-compatible. Tests are run from the root with `npm run test`.