chore: update toolbox version #194


Merged: 9 commits, Apr 24, 2025
Changes from 8 commits
15 changes: 10 additions & 5 deletions .github/sync-repo-settings.yaml
@@ -30,16 +30,21 @@ branchProtectionRules:
       - "conventionalcommits.org"
       - "header-check"
       # Add required status checks like presubmit tests
-      - "langchain-python-sdk-pr-py313 (toolbox-testing-438616)"
-      - "langchain-python-sdk-pr-py312 (toolbox-testing-438616)"
-      - "langchain-python-sdk-pr-py311 (toolbox-testing-438616)"
-      - "langchain-python-sdk-pr-py310 (toolbox-testing-438616)"
-      - "langchain-python-sdk-pr-py39 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py313 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py312 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py311 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py310 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py39 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py313 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py312 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py311 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py310 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py39 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py313-1 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py312-1 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py311-1 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py310-1 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py39-1 (toolbox-testing-438616)"
     requiredApprovingReviewCount: 1
     requiresCodeOwnerReviews: true
     requiresStrictStatusChecks: true
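The required checks above follow a regular naming pattern: `<package>-python-sdk-pr-py<version>`, optionally suffixed, plus the GCP project id in parentheses. Purely as an illustration (this helper is not part of the repo's tooling), the same 15-entry list can be generated programmatically:

```python
# Illustrative generator for the required-status-check names in the diff
# above. The naming pattern and project id mirror the YAML; the helper
# itself is hypothetical, not part of the repository.
PROJECT = "toolbox-testing-438616"
VERSIONS = ["313", "312", "311", "310", "39"]

def check_names(package, suffix=""):
    """Build one check name per supported Python version."""
    return [f"{package}-python-sdk-pr-py{v}{suffix} ({PROJECT})" for v in VERSIONS]

checks = (
    check_names("core")
    + check_names("langchain")
    + check_names("llamaindex", suffix="-1")  # llamaindex checks carry a "-1" suffix
)
print(len(checks))  # → 15
```

Generating the list like this makes it obvious that each SDK must cover the same five Python versions before branch protection will pass.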
84 changes: 84 additions & 0 deletions .github/workflows/lint-toolbox-llamaindex.yaml
@@ -0,0 +1,84 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+name: llamaindex
+on:
+  pull_request:
+    paths:
+      - 'packages/toolbox-llamaindex/**'
+      - '!packages/toolbox-llamaindex/**/*.md'
+  pull_request_target:
+    types: [labeled]
+
+# Declare default permissions as read only.
+permissions: read-all
+
+jobs:
+  lint:
+    if: "${{ github.event.action != 'labeled' || github.event.label.name == 'tests: run' }}"
+    name: lint
+    runs-on: ubuntu-latest
+    concurrency:
+      group: ${{ github.workflow }}-${{ github.ref }}
+      cancel-in-progress: true
+    defaults:
+      run:
+        working-directory: ./packages/toolbox-llamaindex
+    permissions:
+      contents: 'read'
+      issues: 'write'
+      pull-requests: 'write'
+    steps:
+      - name: Remove PR Label
+        if: "${{ github.event.action == 'labeled' && github.event.label.name == 'tests: run' }}"
+        uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
+        with:
+          github-token: ${{ secrets.GITHUB_TOKEN }}
+          script: |
+            try {
+              await github.rest.issues.removeLabel({
+                name: 'tests: run',
+                owner: context.repo.owner,
+                repo: context.repo.repo,
+                issue_number: context.payload.pull_request.number
+              });
+            } catch (e) {
+              console.log('Failed to remove label. Another job may have already removed it!');
+            }
+      - name: Checkout code
+        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+        with:
+          ref: ${{ github.event.pull_request.head.sha }}
+          repository: ${{ github.event.pull_request.head.repo.full_name }}
+          token: ${{ secrets.GITHUB_TOKEN }}
+      - name: Setup Python
+        uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
+        with:
+          python-version: "3.13"
+
+      - name: Install library requirements
+        run: pip install -r requirements.txt
+
+      - name: Install test requirements
+        run: pip install .[test]
+
+      - name: Run linters
+        run: |
+          black --check .
+          isort --check .
+
+      - name: Run type-check
+        env:
+          MYPYPATH: './src'
+        run: mypy --install-types --non-interactive --cache-dir=.mypy_cache/ -p toolbox_llamaindex
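The job-level `if:` expression in this workflow gates every run: the job proceeds for ordinary `pull_request` events, and for `pull_request_target` label events only when the added label is exactly `tests: run`. A minimal sketch of that predicate (a hypothetical helper, not part of the workflow itself):

```python
# Sketch of the job gate used above. GitHub evaluates the same logic as an
# expression; this function just makes the truth table explicit.
def should_run(action, label_name):
    """Run unless the trigger was adding a label other than 'tests: run'."""
    return action != "labeled" or label_name == "tests: run"

# pull_request events (e.g. action == "synchronize") always run:
print(should_run("synchronize", None))      # → True
# pull_request_target 'labeled' events run only for the 'tests: run' label:
print(should_run("labeled", "tests: run"))  # → True
print(should_run("labeled", "kokoro"))      # → False
```

The first step then removes the `tests: run` label so a maintainer must re-apply it for each new revision of an external contributor's PR.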
2 changes: 1 addition & 1 deletion .github/workflows/schedule_reporter.yml
@@ -26,4 +26,4 @@ jobs:
       contents: 'read'
     uses: ./.github/workflows/cloud_build_failure_reporter.yml
     with:
-      trigger_names: "langchain-python-sdk-test-nightly,langchain-python-sdk-test-on-merge,core-python-sdk-test-nightly,core-python-sdk-test-on-merge"
+      trigger_names: "core-python-sdk-test-nightly,core-python-sdk-test-on-merge,langchain-python-sdk-test-nightly,langchain-python-sdk-test-on-merge,llamaindex-python-sdk-test-nightly,llamaindex-python-sdk-test-on-merge"
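The `trigger_names` input above is a single comma-separated string; the reusable reporter workflow is what ultimately splits it. A sketch of that kind of splitting (illustrative only — the real parsing lives inside `cloud_build_failure_reporter.yml`, whose internals are not shown in this diff):

```python
# Split a comma-separated workflow input into individual trigger names,
# dropping empty fragments. The string mirrors the diff above.
trigger_names = (
    "core-python-sdk-test-nightly,core-python-sdk-test-on-merge,"
    "langchain-python-sdk-test-nightly,langchain-python-sdk-test-on-merge,"
    "llamaindex-python-sdk-test-nightly,llamaindex-python-sdk-test-on-merge"
)
triggers = [t.strip() for t in trigger_names.split(",") if t.strip()]
print(len(triggers))  # → 6
```

Each SDK contributes a nightly and an on-merge trigger, so adding a package means adding exactly two entries to this string.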
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1 +1 @@
-{"packages/toolbox-langchain":"0.1.0","packages/toolbox-core":"0.1.0"}
+{"packages/toolbox-langchain":"0.1.0","packages/toolbox-core":"0.1.0","packages/toolbox-llamaindex":"0.1.1"}
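The release-please manifest maps each package path to its currently released version, and the tool reads it to compute the next tag for each package. As a quick illustration (this check is hypothetical, not part of the repo's release tooling), the updated manifest can be parsed and sanity-checked like so:

```python
import json

# Parse the manifest contents from the diff above and verify each entry
# carries a three-part semver-shaped version string. Illustrative only.
manifest = json.loads(
    '{"packages/toolbox-langchain":"0.1.0",'
    '"packages/toolbox-core":"0.1.0",'
    '"packages/toolbox-llamaindex":"0.1.1"}'
)

def is_semver(version):
    """True if `version` looks like MAJOR.MINOR.PATCH with numeric parts."""
    parts = version.split(".")
    return len(parts) == 3 and all(p.isdigit() for p in parts)

assert all(is_semver(v) for v in manifest.values())
print(sorted(manifest))  # package paths, one per released package
```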
4 changes: 3 additions & 1 deletion CHANGELOG.md
@@ -2,5 +2,7 @@ Please refer to each API's `CHANGELOG.md` file under the `packages/` directory
 
 Changelogs
 -----
-- [toolbox-langchain==0.1.0](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-langchain/CHANGELOG.md)
 - [toolbox-core==0.1.0](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-core/CHANGELOG.md)
+- [toolbox-langchain==0.1.0](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-langchain/CHANGELOG.md)
+- [toolbox-llamaindex==0.1.1](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-llamaindex/CHANGELOG.md)

154 changes: 68 additions & 86 deletions packages/toolbox-langchain/README.md
@@ -1,8 +1,8 @@
 ![MCP Toolbox Logo](https://raw.githubusercontent.com/googleapis/genai-toolbox/main/logo.png)
-# MCP Toolbox LangChain SDK
+# MCP Toolbox LlamaIndex SDK
 
 This SDK allows you to seamlessly integrate the functionalities of
-[Toolbox](https://github.com/googleapis/genai-toolbox) into your LangChain LLM
+[Toolbox](https://github.com/googleapis/genai-toolbox) into your LlamaIndex LLM
 applications, enabling advanced orchestration and interaction with GenAI models.
 
 <!-- TOC ignore:true -->
@@ -15,10 +15,7 @@ applications, enabling advanced orchestration and interaction with GenAI models.
 - [Loading Tools](#loading-tools)
   - [Load a toolset](#load-a-toolset)
   - [Load a single tool](#load-a-single-tool)
-- [Use with LangChain](#use-with-langchain)
-- [Use with LangGraph](#use-with-langgraph)
-  - [Represent Tools as Nodes](#represent-tools-as-nodes)
-  - [Connect Tools with LLM](#connect-tools-with-llm)
+- [Use with LlamaIndex](#use-with-llamaindex)
 - [Manual usage](#manual-usage)
 - [Authenticating Tools](#authenticating-tools)
   - [Supported Authentication Mechanisms](#supported-authentication-mechanisms)
@@ -38,41 +38,48 @@ applications, enabling advanced orchestration and interaction with GenAI models.
 ## Installation
 
 ```bash
-pip install toolbox-langchain
+pip install toolbox-llamaindex
 ```
 
 ## Quickstart
 
 Here's a minimal example to get you started using
-[LangGraph](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent):
+# TODO: add link
+[LlamaIndex]():
 
 ```py
-from toolbox_langchain import ToolboxClient
-from langchain_google_vertexai import ChatVertexAI
-from langgraph.prebuilt import create_react_agent
+import asyncio
+
+from llama_index.llms.google_genai import GoogleGenAI
+from llama_index.core.agent.workflow import AgentWorkflow
+
+from toolbox_llamaindex import ToolboxClient
 
-toolbox = ToolboxClient("http://127.0.0.1:5000")
-tools = toolbox.load_toolset()
-
-model = ChatVertexAI(model="gemini-1.5-pro-002")
-agent = create_react_agent(model, tools)
-
-prompt = "How's the weather today?"
-
-for s in agent.stream({"messages": [("user", prompt)]}, stream_mode="values"):
-    message = s["messages"][-1]
-    if isinstance(message, tuple):
-        print(message)
-    else:
-        message.pretty_print()
+async def run_agent():
+    toolbox = ToolboxClient("http://127.0.0.1:5000")
+    tools = toolbox.load_toolset()
+
+    vertex_model = GoogleGenAI(
+        model="gemini-1.5-pro",
+        vertexai_config={"project": "project-id", "location": "us-central1"},
+    )
+    agent = AgentWorkflow.from_tools_or_functions(
+        tools,
+        llm=vertex_model,
+        system_prompt="You are a helpful assistant.",
+    )
+    response = await agent.run(user_msg="Get some response from the agent.")
+    print(response)
+
+asyncio.run(run_agent())
 ```
 
 ## Usage
 
 Import and initialize the toolbox client.
 
 ```py
-from toolbox_langchain import ToolboxClient
+from toolbox_llamaindex import ToolboxClient
 
 # Replace with your Toolbox service's URL
 toolbox = ToolboxClient("http://127.0.0.1:5000")
@@ -102,85 +106,63 @@ tool = toolbox.load_tool("my-tool")
 Loading individual tools gives you finer-grained control over which tools are
 available to your LLM agent.
 
-## Use with LangChain
+## Use with LlamaIndex
 
 LangChain's agents can dynamically choose and execute tools based on the user
 input. Include tools loaded from the Toolbox SDK in the agent's toolkit:
 
 ```py
-from langchain_google_vertexai import ChatVertexAI
+from llama_index.llms.google_genai import GoogleGenAI
+from llama_index.core.agent.workflow import AgentWorkflow
 
-model = ChatVertexAI(model="gemini-1.5-pro-002")
+vertex_model = GoogleGenAI(
+    model="gemini-1.5-pro",
+    vertexai_config={"project": "project-id", "location": "us-central1"},
+)
 
-# Initialize agent with tools
-agent = model.bind_tools(tools)
-
-# Run the agent
-result = agent.invoke("Do something with the tools")
-```
-
-## Use with LangGraph
-
-Integrate the Toolbox SDK with LangGraph to use Toolbox service tools within a
-graph-based workflow. Follow the [official
-guide](https://langchain-ai.github.io/langgraph/) with minimal changes.
-
-### Represent Tools as Nodes
-
-Represent each tool as a LangGraph node, encapsulating the tool's execution within the node's functionality:
-
-```py
-from toolbox_langchain import ToolboxClient
-from langgraph.graph import StateGraph, MessagesState
-from langgraph.prebuilt import ToolNode
-
-# Define the function that calls the model
-def call_model(state: MessagesState):
-    messages = state['messages']
-    response = model.invoke(messages)
-    return {"messages": [response]}  # Return a list to add to existing messages
-
-model = ChatVertexAI(model="gemini-1.5-pro-002")
-builder = StateGraph(MessagesState)
-tool_node = ToolNode(tools)
-
-builder.add_node("agent", call_model)
-builder.add_node("tools", tool_node)
+agent = AgentWorkflow.from_tools_or_functions(
+    tools,
+    llm=vertex_model,
+    system_prompt="You are a helpful assistant.",
+)
+
+# Query the agent
+response = await agent.run(user_msg="Get some response from the agent.")
+print(response)
 ```
 
-### Connect Tools with LLM
+### Maintain state
 
-Connect tool nodes with LLM nodes. The LLM decides which tool to use based on
-input or context. Tool output can be fed back into the LLM:
+To maintain state for the agent, add context as follows:
 
 ```py
-from typing import Literal
-from langgraph.graph import END, START
-from langchain_core.messages import HumanMessage
-
-# Define the function that determines whether to continue or not
-def should_continue(state: MessagesState) -> Literal["tools", END]:
-    messages = state['messages']
-    last_message = messages[-1]
-    if last_message.tool_calls:
-        return "tools"  # Route to "tools" node if LLM makes a tool call
-    return END  # Otherwise, stop
-
-builder.add_edge(START, "agent")
-builder.add_conditional_edges("agent", should_continue)
-builder.add_edge("tools", 'agent')
-
-graph = builder.compile()
-
-graph.invoke({"messages": [HumanMessage(content="Do something with the tools")]})
+from llama_index.core.agent.workflow import AgentWorkflow
+from llama_index.core.workflow import Context
+from llama_index.llms.google_genai import GoogleGenAI
+
+vertex_model = GoogleGenAI(
+    model="gemini-1.5-pro",
+    vertexai_config={"project": "twisha-dev", "location": "us-central1"},
+)
+agent = AgentWorkflow.from_tools_or_functions(
+    tools,
+    llm=vertex_model,
+    system_prompt="You are a helpful assistant",
+)
+
+# Save memory in agent context
+ctx = Context(agent)
+response = await agent.run(user_msg="Give me some response.", ctx=ctx)
+print(response)
 ```
 
 ## Manual usage
 
-Execute a tool manually using the `invoke` method:
+Execute a tool manually using the `call` method:
 
 ```py
-result = tools[0].invoke({"name": "Alice", "age": 30})
+result = tools[0].call({"name": "Alice", "age": 30})
 ```
 
 This is useful for testing tools or when you need precise control over tool
@@ -250,7 +232,7 @@ auth_tools = toolbox.load_toolset(auth_tokens={"my_auth": get_auth_token})
 
 ```py
 import asyncio
-from toolbox_langchain import ToolboxClient
+from toolbox_llamaindex import ToolboxClient
 
 async def get_auth_token():
     # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
@@ -261,7 +243,7 @@ toolbox = ToolboxClient("http://127.0.0.1:5000")
 tool = toolbox.load_tool("my-tool")
 
 auth_tool = tool.add_auth_token("my_auth", get_auth_token)
-result = auth_tool.invoke({"input": "some input"})
+result = auth_tool.call({"input": "some input"})
 print(result)
 ```
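The `get_auth_token` callable shown in this section supplies a fresh ID token whenever the tool is invoked. A minimal stand-in showing only the shape of such a provider (the token value and retrieval logic here are hypothetical; a real implementation would use an OAuth flow or a metadata server):

```python
import asyncio

# Hypothetical stand-in for an async ID-token provider: the SDK awaits this
# coroutine each time an authenticated tool needs a token.
async def get_auth_token():
    await asyncio.sleep(0)  # simulate asynchronous retrieval
    return "fake-id-token"

token = asyncio.run(get_auth_token())
print(token)  # → fake-id-token
```

Because the provider is called per invocation rather than once at setup, short-lived tokens stay valid for long-running agents.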

@@ -329,7 +311,7 @@ use the asynchronous interfaces of the `ToolboxClient`.
 
 ```py
 import asyncio
-from toolbox_langchain import ToolboxClient
+from toolbox_llamaindex import ToolboxClient
 
 async def main():
     toolbox = ToolboxClient("http://127.0.0.1:5000")