This starter kit integrates the Vercel AI SDK UI with a Laravel backend powered by the Prism package, providing a solid foundation for building AI-driven chat applications in a familiar Laravel ecosystem.
- ✅ Persistent Conversations
- ✅ Tool Calling
- ✅ Vercel AI SDK Stream Protocol
- ✅ Generative User Interfaces
- ✅ Prism Integration
- ✅ Built On Laravel React Starter Kit
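To give a feel for what the "Prism Integration" item above involves, here is a minimal sketch of a Laravel route that asks Prism for a completion. The namespace, method names, and route are assumptions based on the public Prism documentation, not this kit's actual code, and may differ between Prism versions:

```php
<?php

// Minimal sketch (not this kit's actual code): a Laravel route that asks
// Prism for a completion and returns it as JSON. Namespaces and method
// names follow the Prism docs and may differ by Prism version.

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;
use Prism\Prism\Enums\Provider;
use Prism\Prism\Prism;

Route::post('/chat', function (Request $request) {
    $response = Prism::text()
        ->using(Provider::OpenAI, 'gpt-4o')          // provider + model
        ->withPrompt($request->input('message', '')) // user's latest message
        ->asText();

    return response()->json(['text' => $response->text]);
});
```

In the kit itself, the reply is streamed back using the Vercel AI SDK stream protocol rather than returned as a single JSON payload, but the underlying Prism call has roughly this shape.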
- Clone the repository and navigate to the project directory:

  ```bash
  git clone https://github.com/benbjurstrom/chat.git && cd chat
  ```
- Install Composer and NPM dependencies:

  ```bash
  composer install && npm install
  ```
- Copy the `.env.example` file and configure environment variables:

  ```bash
  cp .env.example .env && php artisan key:generate
  ```

  Edit the `.env` file to include your database credentials, LLM API keys, and desired LLM settings. For example, to use OpenAI:

  ```env
  PRISM_PROVIDER=openai
  PRISM_MODEL=gpt-4o
  OPENAI_API_KEY=YOUR_OPENAI_API_KEY
  ```

  or to use Gemini:

  ```env
  PRISM_PROVIDER=gemini
  PRISM_MODEL=gemini-2.0-flash
  GEMINI_API_KEY=YOUR_GEMINI_API_KEY
  ```

  or to use Ollama:

  ```env
  PRISM_PROVIDER=ollama
  PRISM_MODEL=llama3.1
  ```

  (A sketch of how these values can be passed to Prism at runtime appears after the installation steps.)
- Run the database migrations and seeders:

  ```bash
  php artisan migrate --seed
  ```
- Start the development server:

  ```bash
  composer run dev
  ```

  This command launches both the Laravel server and the Vite development server.

- After starting the servers, visit http://localhost:8000 and log in with the default credentials to access the chat application:

  - Email: `test@example.com`
  - Password: `password`
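As referenced in the environment configuration step, here is a rough sketch of how the `PRISM_PROVIDER` and `PRISM_MODEL` values might be fed into a Prism call at runtime. The wiring is an assumption for illustration, not the kit's actual implementation; in a real Laravel app you would normally expose these values through a config file instead of calling `env()` outside of `config/`:

```php
<?php

// Hypothetical wiring (not this kit's actual code): resolve the provider
// and model chosen in .env and pass them to Prism. Prism's using() is
// documented to accept either a Provider enum or a string provider name;
// plain strings are shown here to mirror the .env values above.

use Prism\Prism\Prism;

$provider = env('PRISM_PROVIDER', 'openai');
$model    = env('PRISM_MODEL', 'gpt-4o');

$response = Prism::text()
    ->using($provider, $model)
    ->withPrompt('Say hello to the starter kit user.')
    ->asText();

echo $response->text;
```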
This project is licensed under the MIT License. See the LICENSE file for more information.