A C++ framework for building and running AI agents. This project aims to provide a modern, robust, and type-safe environment for AI agent development in C++.
It offers a simple yet powerful API for creating agents in modern C++20.
A key feature is its custom-built, type-safe client for the OpenAI API, which serves as a foundation for communicating with language models.
Any LLM provider that exposes an OpenAI-compatible API, such as Anthropic (Claude) or Google (Gemini), can be used with this library.
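As a rough sketch of the idea, targeting a compatible provider usually amounts to pointing the client at a different base URL and API key. The header, class, and constructor arguments below are hypothetical placeholders, not this library's actual API:

```cpp
// Hypothetical usage sketch -- header, class, and parameter names are
// assumptions, not this repository's real API.
#include <providers/openai/client.hpp>  // assumed header path

int main() {
    // Official OpenAI endpoint.
    providers::openai::Client openai{"https://api.openai.com/v1",
                                     /*api_key=*/"sk-..."};

    // Any OpenAI-compatible endpoint (for example, a gateway in front of
    // Claude or Gemini) is used the same way: only the URL and key change.
    providers::openai::Client gateway{"https://llm-gateway.example.com/v1",
                                      /*api_key=*/"..."};
    return 0;
}
```

The same type-safe request and response path is reused no matter which OpenAI-compatible backend answers.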
The repository is structured as a monorepo to manage multiple libraries and applications.
```
.
├── apps/
│   └── example_runner/       # An example executable demonstrating the framework
├── build/                    # Build output directory (not in git)
├── cmake/                    # Custom CMake helper scripts
├── libs/
│   ├── core/                 # Core utilities (logging, env management)
│   ├── agents/               # Core agent abstractions and implementations
│   └── providers/            # Connectors to external services (e.g., OpenAI)
│       └── include/providers/
│           ├── base/         # Abstract provider interfaces
│           └── openai/       # Type-safe OpenAI client implementation
├── tests/                    # Unit and integration tests
├── .env.example              # Example environment file
├── .gitignore
├── CMakeLists.txt            # Root CMake file
├── CMakePresets.json         # Defines standard build configurations
└── vcpkg.json                # vcpkg dependency manifest
```
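The split between providers/base/ and providers/openai/ reflects the usual interface/implementation layering: agents depend on an abstract provider interface, and the OpenAI client is one concrete implementation of it. A minimal sketch of that pattern (type and method names here are assumptions, not the repository's actual headers):

```cpp
// Sketch of the base/ vs. openai/ layering -- all names here are illustrative,
// not the actual contents of include/providers/.
#include <functional>
#include <string>
#include <utility>

namespace providers::base {

// Abstract interface that the agents library programs against.
class ChatProvider {
public:
    virtual ~ChatProvider() = default;
    // Stream a completion for `prompt`, invoking `on_token` for each chunk.
    virtual void complete(const std::string& prompt,
                          const std::function<void(const std::string&)>& on_token) = 0;
};

}  // namespace providers::base

namespace providers::openai {

// Concrete implementation that talks to the OpenAI (or any compatible) API.
class OpenAIProvider final : public base::ChatProvider {
public:
    explicit OpenAIProvider(std::string api_key) : api_key_(std::move(api_key)) {}

    void complete(const std::string& prompt,
                  const std::function<void(const std::string&)>& on_token) override {
        // Here the real client would issue the HTTP request and forward
        // streamed tokens to on_token; omitted in this sketch.
        (void)prompt;
        (void)on_token;
    }

private:
    std::string api_key_;
};

}  // namespace providers::openai
```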
- A C++20-compatible compiler (e.g., GCC, Clang, MSVC)
- CMake (version 3.20 or higher)
- An OpenAI API key
- Clone the Repository and Submodules:

  ```sh
  git clone --recurse-submodules <your-repo-url>
  cd agents-cpp
  ```
- Set up the Environment: Copy the example `.env` file and add your OpenAI API key.

  ```sh
  cp .env.example .env
  # Now, edit .env and add your OPENAI_API_KEY
  ```
- Configure the CMake Project using Presets: This command generates the build files in `build/debug` and tells CMake to use `vcpkg` to install dependencies automatically. The preset also takes care of the toolchain configuration (see the sample preset after these steps).

  ```sh
  cmake --preset debug
  ```
- Build the Project: This command compiles all the libraries and executables.

  ```sh
  cmake --build build/debug
  ```
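For reference, a debug configure preset that wires in a vcpkg toolchain typically looks something like the sketch below. This is illustrative only: the actual CMakePresets.json in this repository, and the location of vcpkg (assumed here to be a submodule at `./vcpkg`), may differ.

```json
{
  "version": 2,
  "configurePresets": [
    {
      "name": "debug",
      "displayName": "Debug build with vcpkg",
      "binaryDir": "${sourceDir}/build/debug",
      "cacheVariables": {
        "CMAKE_BUILD_TYPE": "Debug",
        "CMAKE_TOOLCHAIN_FILE": "${sourceDir}/vcpkg/scripts/buildsystems/vcpkg.cmake"
      }
    }
  ]
}
```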
After a successful build, you can run the example application:

```sh
./build/debug/apps/example_runner/example_runner
```

You should see a joke from the OpenAI API streamed to the output.
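Under the hood, a minimal runner of this kind boils down to reading the API key, constructing a provider and an agent, and streaming the response. The sketch below is hypothetical: the real example_runner and the library's class and method names will differ.

```cpp
// Hypothetical sketch of a minimal runner -- the actual example_runner source
// and the library's class/method names are assumptions here.
#include <cstdlib>
#include <iostream>
#include <string>

#include <agents/agent.hpp>             // assumed header path
#include <providers/openai/client.hpp>  // assumed header path

int main() {
    const char* key = std::getenv("OPENAI_API_KEY");
    if (key == nullptr) {
        std::cerr << "OPENAI_API_KEY is not set (see .env.example)\n";
        return EXIT_FAILURE;
    }

    providers::openai::Client client{key};  // hypothetical constructor
    agents::Agent agent{client};             // hypothetical agent type

    // Ask for a joke and print tokens as they arrive.
    agent.run_streaming("Tell me a short joke.",
                        [](const std::string& token) { std::cout << token << std::flush; });
    std::cout << '\n';
    return EXIT_SUCCESS;
}
```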