An open-source, multi-model AI chat playground built with Next.js App Router. Switch between providers and models, compare outputs side-by-side, and use optional web search and image attachments.
- Multiple providers: Gemini, OpenRouter (DeepSeek R1, Llama 3.3, Qwen, Mistral, Moonshot, Reka, Sarvam, etc.)
- Selectable model catalog: choose up to 5 models to run
- Web search toggle per message
- Image attachment support (Gemini)
- Clean UI: keyboard submit, streaming-friendly API normalization
- Next.js 14 (App Router, TypeScript)
- Tailwind CSS
- API routes for provider calls
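As a rough sketch of how one of these provider routes can work (this assumes the OpenAI-style chat-completions endpoint that OpenRouter exposes; the request shape and field names below are illustrative, not the project's actual code):

```ts
// Illustrative Next.js App Router handler in the spirit of app/api/openrouter/route.ts.
// Assumes a request body of { model, messages } and returns normalized plain text.
import { NextResponse } from "next/server";

export async function POST(req: Request) {
  const { model, messages } = await req.json();

  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ model, messages }),
  });

  if (!res.ok) {
    return NextResponse.json({ error: await res.text() }, { status: res.status });
  }

  // OpenRouter follows the OpenAI chat-completions shape; pull out the text so the
  // UI sees the same normalized payload regardless of which model was called.
  const data = await res.json();
  const text: string = data?.choices?.[0]?.message?.content ?? "";
  return NextResponse.json({ text });
}
```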
- Install dependencies: `npm i`
- Configure environment: create `.env.local` with the keys you plan to use:
# OpenRouter (recommended for most free models)
OPENROUTER_API_KEY=...
# Gemini (for Gemini models and image input)
GOOGLE_GENERATIVE_AI_API_KEY=...
- Run the dev server: `npm run dev`, then open http://localhost:3000
- `OPENROUTER_API_KEY`: API key from https://openrouter.ai (required for OpenRouter models)
- `GOOGLE_GENERATIVE_AI_API_KEY`: API key from Google AI Studio (required for Gemini models)
You can also provide an API key at runtime in the UI’s Settings panel.
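A minimal sketch of how that fallback could be resolved server-side (the `userKey` parameter and helper name are hypothetical; the project's actual code may differ):

```ts
// Prefer a key supplied at runtime (e.g. from the Settings panel),
// otherwise fall back to the server-side environment variable.
export function resolveOpenRouterKey(userKey?: string): string | undefined {
  return userKey?.trim() || process.env.OPENROUTER_API_KEY;
}
```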
- `app/` – UI and API routes
  - `app/api/openrouter/route.ts` – normalizes responses across OpenRouter models; strips reasoning and cleans up DeepSeek R1 output to plain text
  - `app/api/gemini/route.ts`, `app/api/gemini-pro/route.ts` – Gemini provider routes
- `components/` – UI components (chat box, model selector, etc.)
- `lib/` – model catalog and client helpers
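As an illustration of what an entry in the `lib/` model catalog might look like, together with the 5-model selection cap from the feature list (the type and field names are assumptions, not the project's actual schema):

```ts
// lib/models.ts – illustrative sketch only.
export type ModelEntry = {
  id: string;               // provider-specific id, e.g. "deepseek/deepseek-r1:free"
  label: string;            // display name shown in the model selector
  provider: "openrouter" | "gemini";
  supportsImages?: boolean; // image attachments are only wired up for Gemini
};

export const MAX_SELECTED_MODELS = 5;

// Enforce the "choose up to 5 models" rule when a model is toggled in the selector.
export function toggleModel(selected: string[], id: string): string[] {
  if (selected.includes(id)) return selected.filter((m) => m !== id);
  if (selected.length >= MAX_SELECTED_MODELS) return selected;
  return [...selected, id];
}
```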
Open-Fiesta post-processes DeepSeek R1 outputs to remove reasoning tags and convert Markdown to plain text for readability while preserving content.
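A hedged sketch of that cleanup step, assuming the reasoning arrives wrapped in `<think>…</think>` tags (common for R1-style outputs); the actual logic in `app/api/openrouter/route.ts` may differ:

```ts
// Illustrative post-processing for DeepSeek R1 output: drop reasoning blocks,
// then flatten common Markdown markers to plain text.
export function cleanDeepSeekR1(raw: string): string {
  return raw
    .replace(/<think>[\s\S]*?<\/think>/g, "") // remove reasoning blocks
    .replace(/^#{1,6}\s+/gm, "")              // headings -> plain lines
    .replace(/\*\*(.+?)\*\*/g, "$1")          // bold
    .replace(/\*(.+?)\*/g, "$1")              // italics
    .replace(/`([^`]+)`/g, "$1")              // inline code
    .trim();
}
```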
This project is licensed under the MIT License. See `LICENSE` for details.
- Model access via OpenRouter and Google