
The LLM app backoffice for busy builders


⚠️ This project was discontinued … see → phosphobot ⚠️

The backoffice for your LLM app.

Detect issues and extract insights from your users' text messages.

Gather feedback and measure success. Create the best conversational experience for your users. Analyze logs effortlessly and finally make sense of all your data.

Learn more in the documentation.


Demo 🧪

Demo video: phospho_demo_socials.mp4

Key Features 🚀

  • Clustering: Group similar conversations and identify patterns
  • A/B Testing: Compare different versions of your LLM app
  • Data Labeling: Efficiently categorize and annotate your data
  • User Analytics: Gain insights into user behavior and preferences
  • Integration: Sync data with LangSmith/Langfuse, Argilla, PowerBI
  • Data Visualization: Powerful tools to understand your data
  • Multi-user Experience: Collaborate with your team seamlessly

Deploy with docker compose

Create a .env.docker using this guide. Then, run:

docker compose up

Go to localhost:3000 to see the platform frontend. The backend documentation is available at localhost:8000/v3/docs.
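Once the containers are up, a quick way to check that both services respond (assuming the default ports above and curl installed locally) is:

# Smoke test against the default ports
curl -I http://localhost:3000            # platform frontend
curl -I http://localhost:8000/v3/docs    # backend API docs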

Development guide

Contributing

We welcome contributions from the community. Please refer to our contributing guidelines for more information.

Running locally

This project uses Python 3.11+ and Next.js.

To work on it locally, follow these steps (a consolidated command sketch follows the list):

  1. Make sure you have properly added .env files in ai-hub, extractor, backend, platform.
  2. Install the Temporal CLI: brew install temporal
  3. Create a python virtual environment.
python -m venv .venv
source .venv/bin/activate
  4. Then, the quickest way to get started is to use the Makefile to install dependencies and launch everything:
# Install dependencies
make install
# Launch everything
make up
  5. Go to localhost:3000 to see the platform frontend. The backend documentation is available at localhost:8000/api/docs, localhost:8000/v2/docs, and localhost:8000/v3/docs.

  6. To stop everything, run:

make stop
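For reference, here is a consolidated sketch of a fresh local setup, assuming the directory layout and Makefile targets described above:

# .env files are expected in each service directory:
#   ai-hub/.env  extractor/.env  backend/.env  platform/.env

brew install temporal            # Temporal CLI (macOS; use your platform's package manager otherwise)

python -m venv .venv             # Python 3.11+ virtual environment
source .venv/bin/activate

make install                     # install dependencies for all services
make up                          # launch everything
# If make up does not start a local Temporal server, one can be run with: temporal server start-dev

# When you are done:
make stop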

Related projects

License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.

About us

We are a team of passionate AI builders; feel free to reach out here. With love and baguettes from Paris 🥖💚

Star History
