
Commit 293854f

chore: move toolbox-llamaindex package (#192)
* add basic files
* ci setup
* copy files
* header year
* fix name
* edit trigger names
* fix llamaindex readme
* update toolbox versions
* Update integration.cloudbuild.yaml
1 parent 50e32da commit 293854f

28 files changed · +3833 −95 lines changed

.github/sync-repo-settings.yaml

Lines changed: 10 additions & 5 deletions
@@ -30,16 +30,21 @@ branchProtectionRules:
       - "conventionalcommits.org"
       - "header-check"
       # Add required status checks like presubmit tests
-      - "langchain-python-sdk-pr-py313 (toolbox-testing-438616)"
-      - "langchain-python-sdk-pr-py312 (toolbox-testing-438616)"
-      - "langchain-python-sdk-pr-py311 (toolbox-testing-438616)"
-      - "langchain-python-sdk-pr-py310 (toolbox-testing-438616)"
-      - "langchain-python-sdk-pr-py39 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py313 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py312 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py311 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py310 (toolbox-testing-438616)"
       - "core-python-sdk-pr-py39 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py313 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py312 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py311 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py310 (toolbox-testing-438616)"
+      - "langchain-python-sdk-pr-py39 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py313-1 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py312-1 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py311-1 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py310-1 (toolbox-testing-438616)"
+      - "llamaindex-python-sdk-pr-py39-1 (toolbox-testing-438616)"
     requiredApprovingReviewCount: 1
     requiresCodeOwnerReviews: true
     requiresStrictStatusChecks: true
Lines changed: 84 additions & 0 deletions
@@ -0,0 +1,84 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+name: llamaindex
+on:
+  pull_request:
+    paths:
+      - 'packages/toolbox-llamaindex/**'
+      - '!packages/toolbox-llamaindex/**/*.md'
+  pull_request_target:
+    types: [labeled]
+
+# Declare default permissions as read only.
+permissions: read-all
+
+jobs:
+  lint:
+    if: "${{ github.event.action != 'labeled' || github.event.label.name == 'tests: run' }}"
+    name: lint
+    runs-on: ubuntu-latest
+    concurrency:
+      group: ${{ github.workflow }}-${{ github.ref }}
+      cancel-in-progress: true
+    defaults:
+      run:
+        working-directory: ./packages/toolbox-llamaindex
+    permissions:
+      contents: 'read'
+      issues: 'write'
+      pull-requests: 'write'
+    steps:
+      - name: Remove PR Label
+        if: "${{ github.event.action == 'labeled' && github.event.label.name == 'tests: run' }}"
+        uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
+        with:
+          github-token: ${{ secrets.GITHUB_TOKEN }}
+          script: |
+            try {
+              await github.rest.issues.removeLabel({
+                name: 'tests: run',
+                owner: context.repo.owner,
+                repo: context.repo.repo,
+                issue_number: context.payload.pull_request.number
+              });
+            } catch (e) {
+              console.log('Failed to remove label. Another job may have already removed it!');
+            }
+      - name: Checkout code
+        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+        with:
+          ref: ${{ github.event.pull_request.head.sha }}
+          repository: ${{ github.event.pull_request.head.repo.full_name }}
+          token: ${{ secrets.GITHUB_TOKEN }}
+      - name: Setup Python
+        uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
+        with:
+          python-version: "3.13"
+
+      - name: Install library requirements
+        run: pip install -r requirements.txt
+
+      - name: Install test requirements
+        run: pip install .[test]
+
+      - name: Run linters
+        run: |
+          black --check .
+          isort --check .
+
+      - name: Run type-check
+        env:
+          MYPYPATH: './src'
+        run: mypy --install-types --non-interactive --cache-dir=.mypy_cache/ -p toolbox_llamaindex

.github/workflows/schedule_reporter.yml

Lines changed: 1 addition & 1 deletion
@@ -26,4 +26,4 @@ jobs:
       contents: 'read'
     uses: ./.github/workflows/cloud_build_failure_reporter.yml
     with:
-      trigger_names: "langchain-python-sdk-test-nightly,langchain-python-sdk-test-on-merge,core-python-sdk-test-nightly,core-python-sdk-test-on-merge"
+      trigger_names: "core-python-sdk-test-nightly,core-python-sdk-test-on-merge,langchain-python-sdk-test-nightly,langchain-python-sdk-test-on-merge,llamaindex-python-sdk-test-nightly,llamaindex-python-sdk-test-on-merge"

.release-please-manifest.json

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-{"packages/toolbox-langchain":"0.1.0","packages/toolbox-core":"0.1.0"}
+{"packages/toolbox-langchain":"0.1.0","packages/toolbox-core":"0.1.0","packages/toolbox-llamaindex":"0.1.1"}

CHANGELOG.md

Lines changed: 3 additions & 1 deletion
@@ -2,5 +2,7 @@ Please refer to each API's `CHANGELOG.md` file under the `packages/` directory
 
 Changelogs
 -----
-- [toolbox-langchain==0.1.0](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-langchain/CHANGELOG.md)
 - [toolbox-core==0.1.0](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-core/CHANGELOG.md)
+- [toolbox-langchain==0.1.0](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-langchain/CHANGELOG.md)
+- [toolbox-llamaindex==0.1.1](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-llamaindex/CHANGELOG.md)
+

packages/toolbox-langchain/README.md

Lines changed: 68 additions & 86 deletions
@@ -1,8 +1,8 @@
 ![MCP Toolbox Logo](https://raw.githubusercontent.com/googleapis/genai-toolbox/main/logo.png)
-# MCP Toolbox LangChain SDK
+# MCP Toolbox LlamaIndex SDK
 
 This SDK allows you to seamlessly integrate the functionalities of
-[Toolbox](https://github.com/googleapis/genai-toolbox) into your LangChain LLM
+[Toolbox](https://github.com/googleapis/genai-toolbox) into your LlamaIndex LLM
 applications, enabling advanced orchestration and interaction with GenAI models.
 
 <!-- TOC ignore:true -->
@@ -15,10 +15,7 @@ applications, enabling advanced orchestration and interaction with GenAI models.
     - [Loading Tools](#loading-tools)
       - [Load a toolset](#load-a-toolset)
       - [Load a single tool](#load-a-single-tool)
-    - [Use with LangChain](#use-with-langchain)
-    - [Use with LangGraph](#use-with-langgraph)
-      - [Represent Tools as Nodes](#represent-tools-as-nodes)
-      - [Connect Tools with LLM](#connect-tools-with-llm)
+    - [Use with LlamaIndex](#use-with-llamaindex)
     - [Manual usage](#manual-usage)
 - [Authenticating Tools](#authenticating-tools)
     - [Supported Authentication Mechanisms](#supported-authentication-mechanisms)
@@ -38,41 +35,48 @@ applications, enabling advanced orchestration and interaction with GenAI models.
 ## Installation
 
 ```bash
-pip install toolbox-langchain
+pip install toolbox-llamaindex
 ```
 
 ## Quickstart
 
 Here's a minimal example to get you started using
-[LangGraph](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent):
+# TODO: add link
+[LlamaIndex]():
 
 ```py
-from toolbox_langchain import ToolboxClient
-from langchain_google_vertexai import ChatVertexAI
-from langgraph.prebuilt import create_react_agent
+import asyncio
 
-toolbox = ToolboxClient("http://127.0.0.1:5000")
-tools = toolbox.load_toolset()
+from llama_index.llms.google_genai import GoogleGenAI
+from llama_index.core.agent.workflow import AgentWorkflow
+
+from toolbox_llamaindex import ToolboxClient
 
-model = ChatVertexAI(model="gemini-1.5-pro-002")
-agent = create_react_agent(model, tools)
+async def run_agent():
+    toolbox = ToolboxClient("http://127.0.0.1:5000")
+    tools = toolbox.load_toolset()
 
-prompt = "How's the weather today?"
+    vertex_model = GoogleGenAI(
+        model="gemini-1.5-pro",
+        vertexai_config={"project": "project-id", "location": "us-central1"},
+    )
+    agent = AgentWorkflow.from_tools_or_functions(
+        tools,
+        llm=vertex_model,
+        system_prompt="You are a helpful assistant.",
+    )
+    response = await agent.run(user_msg="Get some response from the agent.")
+    print(response)
 
-for s in agent.stream({"messages": [("user", prompt)]}, stream_mode="values"):
-    message = s["messages"][-1]
-    if isinstance(message, tuple):
-        print(message)
-    else:
-        message.pretty_print()
+asyncio.run(run_agent())
 ```
 
 ## Usage
 
 Import and initialize the toolbox client.
 
 ```py
-from toolbox_langchain import ToolboxClient
+from toolbox_llamaindex import ToolboxClient
 
 # Replace with your Toolbox service's URL
 toolbox = ToolboxClient("http://127.0.0.1:5000")
@@ -102,85 +106,63 @@ tool = toolbox.load_tool("my-tool")
 Loading individual tools gives you finer-grained control over which tools are
 available to your LLM agent.
 
-## Use with LangChain
+## Use with LlamaIndex
 
 LangChain's agents can dynamically choose and execute tools based on the user
 input. Include tools loaded from the Toolbox SDK in the agent's toolkit:
 
 ```py
-from langchain_google_vertexai import ChatVertexAI
+from llama_index.llms.google_genai import GoogleGenAI
+from llama_index.core.agent.workflow import AgentWorkflow
 
-model = ChatVertexAI(model="gemini-1.5-pro-002")
+vertex_model = GoogleGenAI(
+    model="gemini-1.5-pro",
+    vertexai_config={"project": "project-id", "location": "us-central1"},
+)
 
 # Initialize agent with tools
-agent = model.bind_tools(tools)
-
-# Run the agent
-result = agent.invoke("Do something with the tools")
-```
-
-## Use with LangGraph
-
-Integrate the Toolbox SDK with LangGraph to use Toolbox service tools within a
-graph-based workflow. Follow the [official
-guide](https://langchain-ai.github.io/langgraph/) with minimal changes.
-
-### Represent Tools as Nodes
-
-Represent each tool as a LangGraph node, encapsulating the tool's execution within the node's functionality:
-
-```py
-from toolbox_langchain import ToolboxClient
-from langgraph.graph import StateGraph, MessagesState
-from langgraph.prebuilt import ToolNode
-
-# Define the function that calls the model
-def call_model(state: MessagesState):
-    messages = state['messages']
-    response = model.invoke(messages)
-    return {"messages": [response]}  # Return a list to add to existing messages
-
-model = ChatVertexAI(model="gemini-1.5-pro-002")
-builder = StateGraph(MessagesState)
-tool_node = ToolNode(tools)
-
-builder.add_node("agent", call_model)
-builder.add_node("tools", tool_node)
+agent = AgentWorkflow.from_tools_or_functions(
+    tools,
+    llm=vertex_model,
+    system_prompt="You are a helpful assistant.",
+)
+
+# Query the agent
+response = await agent.run(user_msg="Get some response from the agent.")
+print(response)
 ```
 
-### Connect Tools with LLM
+### Maintain state
 
-Connect tool nodes with LLM nodes. The LLM decides which tool to use based on
-input or context. Tool output can be fed back into the LLM:
+To maintain state for the agent, add context as follows:
 
 ```py
-from typing import Literal
-from langgraph.graph import END, START
-from langchain_core.messages import HumanMessage
-
-# Define the function that determines whether to continue or not
-def should_continue(state: MessagesState) -> Literal["tools", END]:
-    messages = state['messages']
-    last_message = messages[-1]
-    if last_message.tool_calls:
-        return "tools"  # Route to "tools" node if LLM makes a tool call
-    return END  # Otherwise, stop
-
-builder.add_edge(START, "agent")
-builder.add_conditional_edges("agent", should_continue)
-builder.add_edge("tools", 'agent')
-
-graph = builder.compile()
-
-graph.invoke({"messages": [HumanMessage(content="Do something with the tools")]})
+from llama_index.core.agent.workflow import AgentWorkflow
+from llama_index.core.workflow import Context
+from llama_index.llms.google_genai import GoogleGenAI
+
+vertex_model = GoogleGenAI(
+    model="gemini-1.5-pro",
+    vertexai_config={"project": "twisha-dev", "location": "us-central1"},
+)
+agent = AgentWorkflow.from_tools_or_functions(
+    tools,
+    llm=vertex_model,
+    system_prompt="You are a helpful assistant",
+)
+
+# Save memory in agent context
+ctx = Context(agent)
+response = await agent.run(user_msg="Give me some response.", ctx=ctx)
+print(response)
 ```
 
 ## Manual usage
 
-Execute a tool manually using the `invoke` method:
+Execute a tool manually using the `call` method:
 
 ```py
-result = tools[0].invoke({"name": "Alice", "age": 30})
+result = tools[0].call({"name": "Alice", "age": 30})
 ```
 
 This is useful for testing tools or when you need precise control over tool
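Pieced together, the updated "Manual usage" snippet above amounts to roughly the following. This is only a sketch assembled from the hunk context; the service URL, tool name, and argument shape are the illustrative values already used in the README, not anything this commit defines.

```py
from toolbox_llamaindex import ToolboxClient

# Illustrative values reused from the README snippets above.
toolbox = ToolboxClient("http://127.0.0.1:5000")
tool = toolbox.load_tool("my-tool")

# Call the tool directly, with no agent or LLM in the loop.
result = tool.call({"name": "Alice", "age": 30})
print(result)
```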
@@ -250,7 +232,7 @@ auth_tools = toolbox.load_toolset(auth_tokens={"my_auth": get_auth_token})
 
 ```py
 import asyncio
-from toolbox_langchain import ToolboxClient
+from toolbox_llamaindex import ToolboxClient
 
 async def get_auth_token():
     # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
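This hunk and the next one touch a single authenticated-call example in the README. Reassembled from the surrounding hunk context, that example reads roughly as follows; the token-retrieval body is elided in the diff and stays elided here.

```py
from toolbox_llamaindex import ToolboxClient

async def get_auth_token():
    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
    ...  # body elided in the diff

toolbox = ToolboxClient("http://127.0.0.1:5000")
tool = toolbox.load_tool("my-tool")

# Wrap the tool so the ID token is fetched and attached on each call.
auth_tool = tool.add_auth_token("my_auth", get_auth_token)
result = auth_tool.call({"input": "some input"})
print(result)
```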
@@ -261,7 +243,7 @@ toolbox = ToolboxClient("http://127.0.0.1:5000")
 tool = toolbox.load_tool("my-tool")
 
 auth_tool = tool.add_auth_token("my_auth", get_auth_token)
-result = auth_tool.invoke({"input": "some input"})
+result = auth_tool.call({"input": "some input"})
 print(result)
 ```

@@ -329,7 +311,7 @@ use the asynchronous interfaces of the `ToolboxClient`.
 
 ```py
 import asyncio
-from toolbox_langchain import ToolboxClient
+from toolbox_llamaindex import ToolboxClient
 
 async def main():
     toolbox = ToolboxClient("http://127.0.0.1:5000")
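The hunk above only shows the top of the async example; the rest falls outside the diff context. A rough sketch of how such a flow might continue is below; note that `aload_toolset` is an assumed async counterpart of `load_toolset` and does not appear anywhere in this diff.

```py
import asyncio

from toolbox_llamaindex import ToolboxClient

async def main():
    toolbox = ToolboxClient("http://127.0.0.1:5000")
    # Assumed async counterpart of load_toolset(); the name is not shown in
    # this diff and may differ in the released package.
    tools = await toolbox.aload_toolset()
    # ... pass `tools` to an agent, or call one directly as in "Manual usage".

asyncio.run(main())
```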
