RenAI is a platform that aims to automate the development of software projects. It centers on an LLM agent that iteratively builds a codebase according to user specifications, interacting with a dedicated environment and outside sources.
- Full Stack Web Application Building: The LLM agent develops both backend and frontend projects in a variety of modern frameworks (e.g., Spring Boot, Angular).
- App Specification: Users can provide detailed specifications for their desired application, including technologies, coding styles and code templates, or rely on the agent to make choices.
- Agent Lifecycle Management: RenAI provides an intuitive interface for managing the agent's workflow and interacting with its environment.
- Configurable Infrastructure: Users can run the agent on common infrastructure to minimize costs or set up an isolated environment for security.
- Testing: The agent regularly writes tests and runs them to ensure code quality.
RenAI is not yet in production, but you can run the code locally following these steps:
- Ensure you have Docker, Minikube, kubectl, Helm, and Java JDK 21 installed. You also need a recent version of Python and at least 6 GB of free RAM for LLM model inference.
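Before starting, you can verify the prerequisites with a small convenience script (a sketch, not part of RenAI; it only checks that the tools are on your `PATH`, not their versions):

```shell
# Print one status line per required tool.
check_prereqs() {
  for tool in docker minikube kubectl helm java python3; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "ok: $tool"
    else
      echo "missing: $tool"
    fi
  done
}
check_prereqs
```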
- Fork and clone the RenAI backend repository and open a terminal in its root.
- You need to build a Docker image for each of the core microservices: `renai-core`, `renai-developer`, `renai-gateway` and `renai-llm-inference`. For instance, for `renai-core`, you need to:
  - Navigate to the microservice root: `cd services/renai-core`
  - Build the Java project: `mvn clean package -DskipTests`
  - Build the image: `eval $(minikube docker-env)` and `docker build -t renai-core:latest .`
  The `renai-llm-inference` service is a Python project, so you can skip the second step.
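The per-service build steps above can be sketched as one loop, assuming the repository layout `services/<name>` and that you run it from the backend repository root. The function below only prints the commands (a dry run), so you can review them and then execute with `build_cmds | sh`:

```shell
# Emit the build commands for all four service images.
build_cmds() {
  # Point docker at Minikube's daemon so the cluster can see the images.
  echo 'eval $(minikube docker-env)'
  # The three Java services need a Maven build before the image build.
  for svc in renai-core renai-developer renai-gateway; do
    echo "(cd services/$svc && mvn clean package -DskipTests && docker build -t $svc:latest .)"
  done
  # renai-llm-inference is Python, so there is no Maven step.
  echo '(cd services/renai-llm-inference && docker build -t renai-llm-inference:latest .)'
}
build_cmds
```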
- Start Minikube and install the Kubernetes deployments with Helm: `helm install std-release .`. Scale each service as needed by modifying the `replicaCount` in `kubernetes/dev/helm/values.yaml`.
- To interact with the system, port-forward the API gateway, for instance `kubectl port-forward service/std-release-renai-gateway 8080:8080`. You can now make calls to `localhost:8080`.
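As a sketch of the scaling knob mentioned above, a per-service `replicaCount` in `kubernetes/dev/helm/values.yaml` might look like this (a hypothetical excerpt; your chart's actual value structure may differ):

```yaml
# kubernetes/dev/helm/values.yaml (hypothetical excerpt)
renai-core:
  replicaCount: 2   # run two renai-core pods
renai-developer:
  replicaCount: 1
```

After editing the values file, apply the change to a running release with `helm upgrade std-release .`.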
- Clone the Web Frontend repository and run `ng serve`. You can now access the app at `localhost:4200` and run your RenAI developer!
RenAI is in the middle stages of development.
All contributions are warmly welcomed. Head over to CONTRIBUTING.md for details.