The Ollama Provider V2 for the AI SDK was created because the original ollama-ai-provider is no longer actively maintained. This provider supports:
- tool streaming and tool calling
- enabling/disabling thinking for models that support it
The Ollama provider is available in the `ollama-ai-provider-v2` module. You can install it with:

```shell
npm i ollama-ai-provider-v2
```
You can import the default provider instance `ollama` from `ollama-ai-provider-v2`:

```typescript
import { ollama } from 'ollama-ai-provider-v2';
```
```typescript
import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('llama3.2:latest'),
  prompt: 'Write a meaty lasagna recipe for 4 people.',
});
```
Thinking can be toggled per request via `providerOptions`:

```typescript
import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('qwen3:4b'),
  providerOptions: { ollama: { think: true } },
  prompt: 'Write a meaty lasagna recipe for 4 people, but really think about it',
});
```
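Tool calling works through the AI SDK's standard `tool` helper. The sketch below is a minimal illustration, assuming the AI SDK v5 `tool`/`inputSchema` API (older versions use `parameters`), a `zod` dependency, and a hypothetical `getWeather` tool; it requires a running local Ollama server:

```typescript
import { ollama } from 'ollama-ai-provider-v2';
import { generateText, tool } from 'ai';
import { z } from 'zod';

const { text, toolResults } = await generateText({
  model: ollama('llama3.2:latest'),
  tools: {
    // Hypothetical example tool; the model decides when to call it.
    getWeather: tool({
      description: 'Get the current weather for a city',
      inputSchema: z.object({ city: z.string() }),
      // Stubbed result for illustration; replace with a real lookup.
      execute: async ({ city }) => ({ city, temperatureC: 21 }),
    }),
  },
  prompt: 'What is the weather in Berlin?',
});
```

For streaming, the same `tools` object can be passed to `streamText` instead of `generateText`.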
Please check out the Ollama provider documentation for more information.