Commit 39d5152

LangChain: Address suggestions by CodeRabbit
1 parent 5844a05 commit 39d5152

File tree

3 files changed, +4 -4 lines changed


.github/workflows/ml-langchain.yml

Lines changed: 0 additions & 1 deletion
@@ -2,7 +2,6 @@ name: LangChain
 
 on:
   pull_request:
-    branches: ~
     paths:
     - '.github/workflows/ml-langchain.yml'
     - 'topic/machine-learning/llm-langchain/**'

topic/machine-learning/llm-langchain/README.md

Lines changed: 2 additions & 2 deletions
@@ -90,7 +90,7 @@ and [CrateDB].
 It is based on the previous notebook, and it illustrates how to use Vertex AI platform
 on Google Cloud for RAG pipeline.
 
-- `agent_with_mcp.py`
+- `agent_with_mcp.py` [![Open on GitHub](https://img.shields.io/badge/Open%20on-GitHub-lightgray?logo=GitHub)](agent_with_mcp.py)
 
 This example illustrates how to use LangGraph and the `langchain-mcp-adapters`
 package to implement an LLM agent that is connecting to the CrateDB MCP server.
@@ -173,7 +173,7 @@ pytest -k document_loader
 pytest -k "notebook and loader"
 ```
 
-To force a regeneration of the Jupyter Notebook, use the
+To force regeneration of Jupyter notebooks, use the
 `--nb-force-regen` option.
 ```shell
 pytest -k document_loader --nb-force-regen

topic/machine-learning/llm-langchain/agent_with_mcp.py

Lines changed: 2 additions & 1 deletion
@@ -24,6 +24,7 @@
 python agent_with_mcp.py
 """
 import asyncio
+import os
 
 from cratedb_about.instruction import GeneralInstructions
 from langchain_mcp_adapters.client import MultiServerMCPClient
@@ -46,7 +47,7 @@ async def amain():
         prompt=GeneralInstructions().render(),
     )
 
-    QUERY_STR = "What is the average value for sensor 1?"
+    QUERY_STR = os.getenv("DEMO_QUERY", "What is the average value for sensor 1?")
     response = await agent.ainvoke({"messages": QUERY_STR})
     answer = response["messages"][-1].content
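
For orientation, the parts of `agent_with_mcp.py` that this diff only hints at (MCP client setup and agent construction) roughly follow the pattern sketched below. This is a minimal sketch, not the file's actual contents: the `cratedb-mcp` stdio command, the `ChatOpenAI` model, the `create_react_agent` wiring, and the awaitable `get_tools()` call (present in `langchain-mcp-adapters` >= 0.1) are assumptions; only the imports, the `prompt=` argument, the `DEMO_QUERY` lookup, and the response handling are taken from the changed lines above.

```python
"""
Minimal sketch of an MCP-connected LangGraph agent; assumptions are marked
in comments, and this is not the verbatim contents of agent_with_mcp.py.
"""
import asyncio
import os

from cratedb_about.instruction import GeneralInstructions
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI  # assumption: any LangChain chat model would do
from langgraph.prebuilt import create_react_agent  # assumption: LangGraph prebuilt ReAct agent


async def amain():
    # Assumption: the CrateDB MCP server is spawned as a stdio subprocess;
    # the command name and arguments here are illustrative only.
    client = MultiServerMCPClient(
        {
            "cratedb": {
                "command": "cratedb-mcp",
                "args": ["serve"],
                "transport": "stdio",
            }
        }
    )
    # Assumption: langchain-mcp-adapters >= 0.1, where get_tools() is awaited.
    tools = await client.get_tools()

    agent = create_react_agent(
        ChatOpenAI(model="gpt-4.1"),  # assumption: illustrative model choice
        tools,
        prompt=GeneralInstructions().render(),  # from the diff
    )

    # From the diff: the demo question is configurable via DEMO_QUERY.
    query = os.getenv("DEMO_QUERY", "What is the average value for sensor 1?")
    response = await agent.ainvoke({"messages": query})
    print(response["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(amain())
```

With this change, the demo question can be overridden at runtime by exporting `DEMO_QUERY` before invoking `python agent_with_mcp.py`, instead of editing the source.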

0 commit comments
