Vercel AI Provider for running LLMs locally using Ollama
Updated Jul 10, 2025 · TypeScript
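For context, a provider like this plugs locally served Ollama models into the Vercel AI SDK's standard calls such as `generateText`. Below is a minimal sketch, assuming the community `ollama-ai-provider` package, an Ollama server running on its default local port, and an illustrative model name already pulled via `ollama pull`:

```ts
// Minimal sketch: using an Ollama-backed provider with the Vercel AI SDK.
// Assumes the `ollama-ai-provider` package and a local Ollama server;
// the model id "llama3.1" is illustrative.
import { generateText } from 'ai';
import { ollama } from 'ollama-ai-provider';

async function main() {
  const { text } = await generateText({
    model: ollama('llama3.1'), // any model available in the local Ollama instance
    prompt: 'Explain in one sentence what running an LLM locally means.',
  });
  console.log(text);
}

main().catch(console.error);
```

Because the provider implements the SDK's model interface, the same call sites can later be pointed at a hosted provider without changing application code.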