ChunkBack

ChunkBack lets you test your apps with LLM provider endpoints without having to pay for the providers.

ChunkBack is a simple Express server that emulates the request and response formats of popular LLM providers, currently Gemini, Anthropic, and OpenAI. ChunkBack accepts a custom prompt language called CBPL that lets you script the responses returned to your application.
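
The directives in the Quick Start below give a feel for CBPL. The glosses here are inferred from the directive names, not documented semantics:

  • SAY "..." - emit the quoted text as response content
  • CHUNKSIZE 3 - presumably the number of tokens per streamed chunk
  • CHUNKLATENCY 50 - presumably the delay in milliseconds between chunks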

Quick Start

Start the server:

npx chunkback@latest

Then, in a second terminal, send it a request:

curl -X POST http://localhost:5654/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "swag",
    "messages": [
      {"role": "user", "content": "SAY \"First message\"\nCHUNKSIZE 3\nCHUNKLATENCY 50\nSAY \"Second message\""}
    ]
  }'
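
Because ChunkBack emulates the OpenAI chat completions endpoint, you can also point an existing client library at it. Below is a minimal TypeScript sketch using the official openai package; it assumes ChunkBack honors stream: true and ignores the API key (run it as an ES module so top-level await works):

import OpenAI from "openai";

// Point the official OpenAI SDK at the local ChunkBack server.
const client = new OpenAI({
  baseURL: "http://localhost:5654/v1",
  apiKey: "not-used", // placeholder; assumed to be ignored by ChunkBack
});

// Stream a CBPL-scripted response and print each chunk as it arrives.
const stream = await client.chat.completions.create({
  model: "swag",
  stream: true, // assumption: ChunkBack streams like the real endpoint
  messages: [
    {
      role: "user",
      content: 'SAY "First message"\nCHUNKSIZE 3\nCHUNKLATENCY 50\nSAY "Second message"',
    },
  ],
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}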

Why use ChunkBack?

  • Deterministic API responses - you will always get back the same content you put in (see the sketch after this list)
  • Saves money - when testing your application, you can stub out your LLM calls with ChunkBack instead of paying a provider
  • Open source - all of the code is right here to read
  • No extra service dependencies - no database, no Redis, nothing but the server code
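
A quick way to check the determinism claim, sketched with Node's built-in fetch. It compares only the returned message content, on the assumption that wrapper fields such as ids or timestamps may vary between requests:

// Send the same CBPL prompt twice and compare the message content.
const ask = async (): Promise<string> => {
  const res = await fetch("http://localhost:5654/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "swag",
      messages: [{ role: "user", content: 'SAY "Hello"' }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content; // OpenAI-style response shape
};

const [first, second] = await Promise.all([ask(), ask()]);
console.assert(first === second, "identical prompts should yield identical content");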

Contributing

See CONTRIBUTING.md for contribution guidelines.

License

MIT License - see LICENSE for details.

AI Notice

This project was built with the assistance of AI code-generation tools; most of the code was (painstakingly) reviewed by a human. This README.md was handwritten, and should be kept that way.
