nordwestt/ollama-ai-provider-v2

Ollama Provider V2 for the Vercel AI SDK

The Ollama Provider V2 for the AI SDK was created because the original ollama-ai-provider is no longer actively maintained.

This provider now supports:

  • tool calling and tool-call streaming
  • enabling/disabling thinking mode
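
Tool calling follows the standard AI SDK pattern; a minimal sketch is below. The tool name, description, and `getWeather` implementation are illustrative, not part of this provider, and the tool-definition field names vary between AI SDK major versions (`parameters` in v4, `inputSchema` in v5) — adjust to match your installed `ai` version. Running it also requires a local Ollama instance serving a tool-capable model.

```typescript
import { ollama } from 'ollama-ai-provider-v2';
import { generateText, tool } from 'ai';
import { z } from 'zod';

const { text, toolResults } = await generateText({
  model: ollama('llama3.2:latest'),
  tools: {
    // Hypothetical example tool; replace with your own.
    getWeather: tool({
      description: 'Get the current weather for a city',
      parameters: z.object({ city: z.string() }), // `inputSchema` in AI SDK v5
      execute: async ({ city }) => `Sunny in ${city}`,
    }),
  },
  prompt: 'What is the weather in Berlin?',
});
```

For streamed tool calls, the same `tools` object can be passed to `streamText` from the `ai` package.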

Setup

The Ollama provider is available in the ollama-ai-provider-v2 module. You can install it with:

npm i ollama-ai-provider-v2

Provider Instance

You can import the default provider instance ollama from ollama-ai-provider-v2:

import { ollama } from 'ollama-ai-provider-v2';

Example

import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('llama3.2:latest'),
  prompt: 'Write a meaty lasagna recipe for 4 people.',
});

Thinking mode toggle example

import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('qwen3:4b'),
  providerOptions: { ollama: { think: true } },
  prompt: 'Write a meaty lasagna recipe for 4 people, but really think about it',
});

Documentation

Please check out the Ollama provider documentation for more information.