Help with passthrough endpoints? #7916
Dima-Mediator
started this conversation in
General
Following up on @krrishdholakia's response in #361 (comment):
```yaml
general_settings:
  master_key: sk-1234
  pass_through_endpoints:
    # Existing "custom" endpoint specification, as documented at
    # https://docs.litellm.ai/docs/proxy/pass_through
    - path: "/v1/rerank"
      type: custom  # New optional param (defaults to "custom" for backward compatibility)
      target: "https://api.cohere.com/v1/rerank"
      headers:
        Authorization: "bearer os.environ/COHERE_API_KEY"
        content-type: application/json
        accept: application/json
      forward_headers: True
    - path: "/anthropic"
      type: anthropic  # Would behave as Anthropic for billing purposes, LiteLLM keys, etc.
      litellm_params:
        api_base: <...>
        api_key: <your-api-key>
        # ...any other parameters applicable to pass-through endpoints of this type
    - path: "/anthropic"  # Repeating a path indicates a load-balanced deployment (similar to models)
      type: anthropic
      litellm_params:
        api_base: <...>
        api_key: <your-api-key>
        # ...any other applicable parameters
    - path: "/anthropic-fallback"
      type: anthropic
      litellm_params:
        api_base: <...>
        api_key: <your-api-key>
    - path: "/vertex_ai"
      type: vertex_ai
      enabled: false  # Optional param "enabled" (default true), so users can explicitly disable a built-in endpoint

# Default behavior (when no config is given) stays as it works today: all built-in endpoints available.
# Fallback config for pass-through endpoints. Arguably, supporting fallbacks is more important than load balancing.
router_settings:
  pass_through_endpoints_fallbacks: [{"/anthropic": ["/anthropic-fallback"]}]
```
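The load-balancing and fallback semantics proposed above could be sketched in plain Python. This is only an illustration of the intended behavior; names like `resolve_route` and the entry shapes are hypothetical, not LiteLLM internals:

```python
import random

# Pass-through endpoint entries as they would appear in the proposed config.
# Duplicated paths form a load-balanced pool (analogous to model deployments).
endpoints = [
    {"path": "/anthropic", "api_key": "key-a"},
    {"path": "/anthropic", "api_key": "key-b"},
    {"path": "/anthropic-fallback", "api_key": "key-c"},
]

# The proposed router_settings.pass_through_endpoints_fallbacks entry.
fallbacks = {"/anthropic": ["/anthropic-fallback"]}

def pool(path):
    """All deployments registered under one path (the load-balanced pool)."""
    return [e for e in endpoints if e["path"] == path]

def resolve_route(path):
    """Pick a deployment for `path`; if its pool is empty, walk the fallback chain."""
    for candidate in [path] + fallbacks.get(path, []):
        deployments = pool(candidate)
        if deployments:
            return random.choice(deployments)  # naive random load balancing
    raise LookupError(f"no deployment available for {path}")
```

Here `resolve_route("/anthropic")` would pick one of the two `/anthropic` deployments; if both became unavailable, it would route to `/anthropic-fallback`.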
Hi, the documentation for pass-through endpoints is rather scarce. Can someone help?
1. Is there a way to enable load balancing (with different back-end provider API keys) for built-in pass-through endpoints (e.g. `/anthropic/v1/messages`)? From the documentation examples it seems to rely on `$ANTHROPIC_API_KEY` and nothing else.
2. Can custom pass-through endpoints completely replace built-in endpoints, and is load balancing possible with custom endpoints?
3. Built-in pass-through endpoints (`/anthropic`, `/gemini`, etc.) support LiteLLM virtual-key auth by default, but the same for custom pass-through endpoints requires setting `auth: true` and is an Enterprise feature. Is that correct?
4. What exactly does the `use_client_credentials_pass_through_routes` setting do? Does it apply only to custom pass-through endpoints or to all of them (including built-in)? Is it Enterprise-only?
Thanks!