Replies: 1 comment
-
Here's the workaround (works with `bind_tools`):

```python
import os
from typing import Optional

from dotenv import load_dotenv
from langchain_core.utils.utils import secret_from_env
from langchain_openai import ChatOpenAI
from pydantic import Field, SecretStr

load_dotenv()


class ChatOpenRouter(ChatOpenAI):
    openai_api_key: Optional[SecretStr] = Field(
        alias="api_key",
        default_factory=secret_from_env("OPENROUTER_API_KEY", default=None),
    )

    @property
    def lc_secrets(self) -> dict[str, str]:
        return {"openai_api_key": "OPENROUTER_API_KEY"}

    def __init__(self, openai_api_key: Optional[str] = None, **kwargs):
        openai_api_key = openai_api_key or os.environ.get("OPENROUTER_API_KEY")
        super().__init__(
            base_url="https://openrouter.ai/api/v1",
            openai_api_key=openai_api_key,
            **kwargs,
        )


openrouter_model = ChatOpenRouter(model_name="anthropic/claude-3.7-sonnet:thinking")
```
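The key-resolution line in `__init__` above follows a common fallback pattern: prefer an explicitly passed key, otherwise read it from the environment. A minimal stdlib-only sketch of that pattern, with `resolve_api_key` as a hypothetical helper name (the env-var name matches the workaround, the values are illustrative):

```python
import os
from typing import Optional


def resolve_api_key(
    explicit_key: Optional[str] = None,
    env_var: str = "OPENROUTER_API_KEY",
) -> Optional[str]:
    """Prefer an explicitly passed key; otherwise fall back to the environment."""
    return explicit_key or os.environ.get(env_var)


# Illustrative values only.
os.environ["OPENROUTER_API_KEY"] = "sk-demo"
print(resolve_api_key())               # env fallback -> sk-demo
print(resolve_api_key("sk-explicit"))  # explicit argument wins -> sk-explicit
```

Note that `or` also treats an empty string as "not provided", which is usually the desired behavior for API keys.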
-
Feature request
I am using models from OpenRouter via ChatOpenAI. This works for invoking and streaming; however, tool calling does not work.
It would be great if there were a ChatOpenRouter integration to help connect OpenRouter to my langgraph project.
Motivation
I wanted to use OpenRouter, but it is not supported out of the box, at least not without specific tuning that I am unaware of.
Proposal (If applicable)
No response