Replies: 3 comments 4 replies
-
Hello again, @fabiancpl! I hope everything has been going great since last time. 🚀

In the LangChain framework, the equivalent of `create_openai_functions_agent` for Gemini would be something along the lines of a `create_gemini_functions_agent`. Here is an example of how you might use it:

```python
from langchain.agents import create_gemini_functions_agent  # placeholder name, see note below
from langgraph.prebuilt import create_agent_executor
from langchain_community.tools.bing_search.tool import BingSearchRun
from langchain_community.utilities import BingSearchAPIWrapper
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_gemini import ChatGemini  # placeholder package, see note below

# Chat model and the tools the agent may call
gemini_chat = ChatGemini()
search_tool = BingSearchRun(api_wrapper=BingSearchAPIWrapper())
tools = [search_tool]

# Prompt with an agent_scratchpad placeholder for intermediate tool calls
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a shopping assistant on an e-commerce platform, but you are not always aware of the latest product releases. In those cases you can search the internet to get the most up-to-date information possible."),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)

# Build the agent and wrap it in an executor
gemini_agent = create_gemini_functions_agent(gemini_chat, tools, prompt)
gemini_agent_executor = create_agent_executor(gemini_agent, tools)
```

Please note that you need to have the Gemini environment properly set up, including any authentication Gemini requires. You should also understand the specific capabilities and limitations of the Gemini functions you plan to use, as this will influence the design and functionality of your agent. Consider error handling and retries, since network or API issues can occur. Pay attention to the performance implications of calling remote functions, including latency and potential rate limits. If your agent will process sensitive data, ensure compliance with the relevant data protection and privacy regulations.

Unfortunately, I couldn't find a specific module in the LangChain codebase that handles the integration with Gemini the way `create_openai_functions_agent` handles OpenAI, so treat the function and package names in the snippet above as placeholders.

For setting up the Gemini environment for LangChain, you can follow the steps provided in the context above. These steps involve setting up the OpenAI API key, configuring Astra DB, optionally configuring a Cassandra cluster, saving and applying the configuration, and verifying the environment variables.

I hope this helps! If you have any further questions or need more clarification, please let me know.
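On the error-handling point, here is a minimal retry sketch. The `invoke_with_retries` helper is illustrative (not part of LangChain), and the commented usage line assumes the `gemini_agent_executor` from the snippet above:

```python
import time
from typing import Any, Callable


def invoke_with_retries(
    invoke: Callable[[dict], Any],
    payload: dict,
    max_attempts: int = 3,
    base_delay: float = 1.0,
) -> Any:
    """Call an agent executor's invoke() with simple exponential backoff.

    Transient network or API errors are retried; the last error is re-raised
    once the attempt budget is exhausted.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return invoke(payload)
        except Exception as exc:  # narrow to your client's error types in practice
            if attempt == max_attempts:
                raise
            delay = base_delay * (2 ** (attempt - 1))
            print(f"Attempt {attempt} failed ({exc!r}); retrying in {delay:.1f}s")
            time.sleep(delay)


# Illustrative usage with the (hypothetical) executor from the snippet above:
# result = invoke_with_retries(
#     gemini_agent_executor.invoke,
#     {"input": "What is the latest Pixel phone?"},
# )
```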
-
I found this:
-
This is probably pretty messed up (made by Claude and untested), but maybe it can help figure things out. I will work on it more, but I am not a coder, so try at your own risk.
-
Checked other resources
Commit to Help
Example Code
Description
I am interested in building an agent similar to this but using Gemini function calling. Is there something similar to `create_openai_functions_agent` for Gemini? What other considerations do I need to take into account? (A sketch of the OpenAI-functions pattern I mean is below, after the version list.)

System Info
openai = "^1.3.7"
langchain-core = "0.1.31"
langchain = "0.1.12"
langchain-community = "0.0.28"
langchain-google-genai = "^0.0.9"
langchain-openai = "^0.0.8"
langgraph = "^0.0.28"
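For context, a minimal sketch of the OpenAI-functions pattern being referred to, assuming `OPENAI_API_KEY` is set; the `get_product_info` tool is a made-up placeholder, and the open question is what the Gemini analogue of `create_openai_functions_agent` looks like:

```python
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def get_product_info(product_name: str) -> str:
    """Look up basic information about a product in the catalog."""
    # Placeholder implementation; a real tool would query the catalog here.
    return f"No release information found for {product_name}."


llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
tools = [get_product_info]

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a shopping assistant. Use the tools when you need product details."),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)

# OpenAI-functions agent plus executor; the Gemini version would presumably
# swap in langchain_google_genai's ChatGoogleGenerativeAI once an equivalent
# agent constructor is available.
agent = create_openai_functions_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

print(agent_executor.invoke({"input": "Is there a newer model of the Pixel phone?"}))
```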