update docs for server and custom-llm auth #448

Merged: 1 commit, May 28, 2025
fern/customization/custom-llm/using-your-server.mdx (37 additions, 0 deletions)
@@ -63,6 +63,43 @@ Paste the public URL generated by ngrok (e.g., https://your-unique-id.ngrok.io)
**4. Test the Connection:**
Send a test message through the Vapi interface to ensure it reaches your local server and receives a response from the OpenAI API. Verify that the response is displayed correctly in Vapi.

## Authentication (Optional)

For production deployments, you can secure your custom LLM endpoint using authentication. This ensures only authorized requests from Vapi can access your LLM server.

![Custom LLM authentication configuration](../../static/images/server-url/authentication/custom-llm.png)

### Configuration Options

Vapi supports two authentication methods for custom LLMs:

1. **API Key**: Simple authentication where Vapi includes a static API key in request headers. Your server validates this key to authorize requests.

2. **OAuth2 Credentials**: More secure authentication using the OAuth2 client credentials flow, with automatic token refresh.

### API Key Authentication

When using API Key authentication:
- Vapi sends your API key in the Authorization header to your custom LLM endpoint
- Your server validates the API key before processing the request (see the sketch below)
- Simple to implement and suitable for basic security requirements
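
A minimal sketch of the server-side check, assuming a Node/Express server and that the configured key arrives in the `Authorization` header (either raw or as `Bearer <key>`). The endpoint path and environment variable name are placeholders:

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Placeholder: the same key you configured for this assistant in the Vapi dashboard.
const EXPECTED_KEY = process.env.CUSTOM_LLM_API_KEY;

app.post("/chat/completions", (req, res) => {
  // Accept either a raw key or a "Bearer <key>" value in the Authorization header.
  const header = req.headers.authorization ?? "";
  const key = header.startsWith("Bearer ") ? header.slice(7) : header;

  if (!EXPECTED_KEY || key !== EXPECTED_KEY) {
    return res.status(401).json({ error: "unauthorized" });
  }

  // Authorized: forward the conversation to your LLM and return its response here.
  res.json({ ok: true });
});

app.listen(3000);
```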

### OAuth2 Authentication

When configuring OAuth2 in the Vapi dashboard:

1. **OAuth2 URL**: Enter your OAuth2 token endpoint (e.g., `https://your-server.com/oauth/token`)
2. **OAuth2 Client ID**: Your OAuth2 client identifier
3. **OAuth2 Client Secret**: Your OAuth2 client secret

### How OAuth2 Works

1. Vapi requests an access token from your OAuth2 endpoint using client credentials (see the sketch after this list)
2. Your server validates the credentials and returns an access token
3. Vapi includes the token in the Authorization header for LLM requests
4. Your server validates the token before processing requests
5. Tokens automatically refresh when they expire
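
To sanity-check your token endpoint, you can reproduce the exchange Vapi performs against your OAuth2 URL. This is only a sketch: the endpoint, client ID, and secret are placeholders, and it assumes a standard client-credentials grant sent as a URL-encoded form:

```typescript
// Sketch: reproduce the token exchange Vapi performs against your OAuth2 URL.
async function fetchAccessToken(): Promise<string> {
  const response = await fetch("https://your-server.com/oauth/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "client_credentials",
      client_id: "your-client-id",
      client_secret: "your-client-secret",
    }),
  });

  if (!response.ok) {
    throw new Error(`Token request failed: ${response.status}`);
  }

  const { access_token } = (await response.json()) as { access_token: string };
  return access_token;
}

// The returned token is what Vapi sends as `Authorization: Bearer <token>`
// on subsequent requests to your custom LLM endpoint.
fetchAccessToken().then((token) => console.log(token.slice(0, 12) + "..."));
```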

## Step 3: Understanding the Communication Flow
**1. Vapi Sends POST Request:**
When a user interacts with your Vapi application, Vapi sends a POST request containing conversation context and metadata to the configured endpoint (your ngrok URL).
fern/server-url/server-authentication.mdx (28 additions, 13 deletions)
@@ -61,28 +61,43 @@ This could include short lived JWTs/API Keys passed along via the Authorization

For OAuth2-protected webhook endpoints, you can configure OAuth2 credentials that Vapi will use to obtain and refresh access tokens.

#### Configuration (at the assistant level)

```json
{
  "server": {
    "url": "https://your-server.com/webhook"
  },
  "credentials": [
    {
      "provider": "webhook",
      "authenticationPlan": {
        "type": "oauth2",
        "url": "https://your-server.com/oauth/token",
        "clientId": "your-client-id",
        "clientSecret": "your-client-secret",
        "scope": "optional, only needed to specify which scopes to request access for"
      }
    }
  ]
}
```
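
If you prefer to set this up programmatically rather than through the dashboard, the same payload can be sent when creating (or updating) the assistant via the Vapi API. This is a sketch only: the endpoint shown follows Vapi's public API conventions, and `VAPI_API_KEY` is a placeholder for your own key:

```typescript
// Sketch: applying the assistant-level server + credentials config through the Vapi API.
const response = await fetch("https://api.vapi.ai/assistant", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.VAPI_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    server: { url: "https://your-server.com/webhook" },
    credentials: [
      {
        provider: "webhook",
        authenticationPlan: {
          type: "oauth2",
          url: "https://your-server.com/oauth/token",
          clientId: "your-client-id",
          clientSecret: "your-client-secret",
        },
      },
    ],
  }),
});

console.log(await response.json());
```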

#### Configuration (via our Dashboard)

<Steps>
<Step title="Visit the API Keys page">
Go to [https://dashboard.vapi.ai/keys](https://dashboard.vapi.ai/keys) to manage your OAuth2 credentials.
</Step>
</Steps>

<Frame caption="OAuth2 configuration in the Vapi dashboard">
<img src="../static/images/server-url/authentication/webhook.png" />
</Frame>

#### OAuth2 Flow

1. Vapi makes a request to your token endpoint with client credentials (Content-Type `application/x-www-form-urlencoded`)
2. Your server validates the credentials and returns an access token
3. Vapi includes the access token in the Authorization header for webhook requests
4. Your server validates the access token before processing the webhook (see the combined sketch below)
@@ -96,9 +96,7 @@ Your server should return a JSON response with the following format:
{
  "access_token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
  "token_type": "Bearer",
  "expires_in": 3600
}
```
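
A minimal sketch that ties the flow together: a token endpoint that checks client credentials and returns the response shape above, plus a webhook route that validates the issued token. It assumes Express, reads the client ID and secret from environment variables, and keeps issued tokens in memory purely for illustration:

```typescript
import express from "express";
import crypto from "node:crypto";

const app = express();
app.use(express.urlencoded({ extended: false })); // token requests arrive form-encoded
app.use(express.json()); // webhook payloads arrive as JSON

// Illustration only: real deployments should persist tokens or use signed JWTs.
const issuedTokens = new Map<string, number>(); // token -> expiry (ms since epoch)

app.post("/oauth/token", (req, res) => {
  const { grant_type, client_id, client_secret } = req.body;

  if (grant_type !== "client_credentials") {
    return res.status(400).json({ error: "unsupported_grant_type" });
  }
  if (client_id !== process.env.OAUTH_CLIENT_ID || client_secret !== process.env.OAUTH_CLIENT_SECRET) {
    return res.status(401).json({ error: "invalid_client" });
  }

  const accessToken = crypto.randomBytes(32).toString("hex");
  issuedTokens.set(accessToken, Date.now() + 3600 * 1000);

  res.json({ access_token: accessToken, token_type: "Bearer", expires_in: 3600 });
});

app.post("/webhook", (req, res) => {
  const header = req.headers.authorization ?? "";
  const token = header.startsWith("Bearer ") ? header.slice(7) : "";
  const expiresAt = issuedTokens.get(token);

  if (!expiresAt || expiresAt < Date.now()) {
    return res.status(401).json({ error: "invalid_token" });
  }

  // Token accepted: handle the Vapi webhook payload here.
  res.json({ received: true });
});

app.listen(3000);
```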

@@ -116,3 +116,5 @@ Common error types:
- `invalid_grant`: Invalid or expired refresh token
- `invalid_scope`: Invalid scope requested
- `unauthorized_client`: Client not authorized for this grant type

<Note> If you use the OAuth2 flow to authenticate tool calls, make sure the tool's server URL is the endpoint that should be called *after* the token exchange has completed. </Note>