An MCP SSE server that lets you do Virtual Try-On through LLM chat, using a back-end database hosted locally or in the cloud (Couchbase) and the Replicate API.
The MCP server uses the following services and libraries:
- cpp-mcp: C++ MCP Client and Server library.
- Eigen: C++ linear algebra library.
- Ollama: C++ bindings for the Ollama API.
- Couchbase (optional): CSV files alongside embeddings are uploaded to Couchbase (local) or Couchbase Capella (cloud). Works on the free tier!
- The .CSV file in the `db` folder (required if not using Couchbase).
- Ollama (local model runtime).
- ...more coming soon.
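The CSV-backed mode can be pictured as brute-force cosine-similarity search over embedding rows. Here is a minimal sketch in plain, dependency-free C++; the file layout and function names are illustrative, not the repo's actual code (the project itself uses Eigen for this kind of math):

```cpp
#include <cassert>
#include <cmath>
#include <sstream>
#include <string>
#include <vector>

// Parse one CSV row of numbers into an embedding vector.
// Illustrative only: the real db/ CSV layout may differ.
std::vector<float> parseEmbedding(const std::string& row) {
    std::vector<float> v;
    std::stringstream ss(row);
    std::string cell;
    while (std::getline(ss, cell, ',')) v.push_back(std::stof(cell));
    return v;
}

// Cosine similarity between two equal-length vectors.
float cosine(const std::vector<float>& a, const std::vector<float>& b) {
    float dot = 0, na = 0, nb = 0;
    for (size_t i = 0; i < a.size(); ++i) {
        dot += a[i] * b[i];
        na  += a[i] * a[i];
        nb  += b[i] * b[i];
    }
    return dot / (std::sqrt(na) * std::sqrt(nb));
}

// Brute-force nearest row to a query embedding.
size_t nearest(const std::vector<std::vector<float>>& rows,
               const std::vector<float>& query) {
    size_t best = 0;
    float bestSim = -1.0f;
    for (size_t i = 0; i < rows.size(); ++i) {
        float s = cosine(rows[i], query);
        if (s > bestSim) { bestSim = s; best = i; }
    }
    return best;
}
```

With Eigen, `cosine` collapses to `a.dot(b) / (a.norm() * b.norm())` on `Eigen::VectorXf`.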
I already built 2 MCP servers in Python before and the dependencies and Docker image sizes were too huge. Yes, I could have used an easier example like GoLang but I thought why not!
This project taught me how much I take Python abstractions for granted. π C/C++ humbles you that way.
And yes, doing it in C++ was advantageous.
A basic example using Couchbase (Cloud) services.
(demo video: output.mp4)
An example of regressive inference using the MCP server with Couchbase (cloud) services. Regressive here means a previous output can serve as the base image for the next virtual try-on, e.g. T-shirt try-on -> pant try-on using the T-shirt try-on image.
output.mp4
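The regressive flow is just function composition over image references: each try-on call takes a base image and a garment, and its result feeds the next call. A hypothetical sketch, where the `tryOn` stub stands in for the real Replicate-backed tool (all names here are illustrative):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Stub for the Replicate-backed try-on tool: takes a base image
// reference and a garment, and returns a reference to the result image.
// The real call is an API request; this stand-in just records the chain.
std::string tryOn(const std::string& baseImage, const std::string& garment) {
    return baseImage + "+" + garment;
}

// Regressive inference: each previous output becomes the next base image.
std::string regressiveTryOn(std::string base,
                            const std::vector<std::string>& garments) {
    for (const auto& g : garments) base = tryOn(base, g);
    return base;
}
```

So a T-shirt try-on followed by a pant try-on is `regressiveTryOn(person, {tshirt, pants})`: the pant step runs on the T-shirt result, not the original photo.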
Because why not? Using local CSV vector search.
(demo video: output.mp4)
Contributions are always welcome.
This project is licensed under the MIT License.