Warning: This project is in active development and in a very early stage. Breaking changes may occur at any time.
Conversations is an open-source AI chatbot designed to be simple, secure and privacy-friendly.
Why another AI chatbot? Because we want full control over our data and over the way we interact with AI. We want a friendly end-user interface, approachable code, and the ability to easily customize the chatbot to our needs.
We leverage open-source projects such as Vercel's AI SDK and Pydantic AI, assembling them in a way that makes sense for us and lets us focus on the product.
This assistant is also meant to be integrated into the "La Suite numérique" ecosystem of tools for public services.
Any help to improve the project is very welcome!
Conversations is easy to install on your own servers
Available methods: Helm chart (see the illustrative commands below); a Nix package is coming soon
In the works: Docker Compose, with YunoHost to follow
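For illustration only, a Helm-based install typically follows the standard repo-add/install pattern sketched below. The repository URL and chart name here are hypothetical placeholders, not the project's published chart; refer to the Helm chart documentation for the real values.
$ helm repo add conversations https://example.org/helm-charts   # hypothetical repository URL
$ helm repo update
$ helm install conversations conversations/conversations --namespace conversations --create-namespace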
You can test Conversations in your browser by visiting this => TBD
⚠️ The methods described below for running Conversations locally are for testing purposes only.
Prerequisite
Make sure you have a recent version of Docker and Docker Compose installed on your laptop, then type:
$ docker -v
Docker version 20.10.2, build 2291f61
$ docker compose version
Docker Compose version v2.32.4
⚠️ You may need to run the following commands with `sudo`, but this can be avoided by adding your user to the local `docker` group.
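To do that on most Linux distributions, you can follow Docker's standard post-install step of adding your user to the `docker` group, then log out and back in for the change to take effect:
$ sudo usermod -aG docker $USER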
Project bootstrap
The easiest way to start working on the project is to use GNU Make:
$ make bootstrap FLUSH_ARGS='--no-input'
This command builds the `app-dev` and `frontend-dev` containers, installs dependencies, performs database migrations and compiles translations. It's a good idea to use this command each time you pull code from the project repository to avoid dependency-related or migration-related issues.
Your Docker services should now be up and running.
You can access the project by going to http://localhost:3000.
You will be prompted to log in. The default credentials are:
username: conversations
password: conversations
Note that if you need to start the services again later, you can use the eponymous Make rule:
$ make run
To run the frontend on its own in development mode, first install the frontend dependencies with the following command:
$ make frontend-development-install
And run the frontend locally in development mode with the following command:
$ make run-frontend-development
To start all the services except the frontend container, use the following command:
$ make run-backend
Adding content
You can create a basic demo site by running this command:
$ make demo
Finally, you can check all available Make rules using this command:
$ make help
Django admin
You can access the Django admin site at:
You first need to create a superuser account:
$ make superuser
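If you prefer not to go through Make, this rule presumably wraps Django's standard createsuperuser management command. Assuming the backend runs in the `app-dev` container built during bootstrap, a direct equivalent would look like this (a sketch, not the project's documented command):
$ docker compose run --rm app-dev python manage.py createsuperuser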
This work is released under the MIT License (see LICENSE).
While Conversations is a public-driven initiative, our licence choice is an invitation for private sector actors to use, sell and contribute to the project.
You can help us with translations on Crowdin.
If you intend to make pull requests, see CONTRIBUTING for guidelines.
.
├── bin - executable scripts or binaries used for various tasks, such as setup scripts, utility scripts, or custom commands.
├── crowdin - configuration for Crowdin, the service used to manage the project's translations.
├── docker - Dockerfiles and related configuration files used to build the project's Docker images for development, testing, or production environments.
├── docs - documentation for the project, including user guides, API documentation, and other helpful resources.
├── env.d/development - environment-specific configuration files for the development environment, such as environment variables and other settings needed for development.
├── gitlint - configuration files for `gitlint`, a tool that enforces commit message guidelines to ensure consistent, high-quality commit messages.
└── src - main source code directory, containing the core application code, libraries, and modules of the project.
Conversations is built on top of Django Rest Framework, Next.js, Vercel's AI SDK and Pydantic AI. We thank the contributors of all these projects for their awesome work!