-
Hello @akash9771! I'm here to help you with any bugs, questions, or contributions. Let's work together to solve the issue you're facing.

To pass a `RunnableConfig` at runtime, you can forward the `config` argument of a `RunnableLambda` to the chains it invokes internally:

````python
import json

from langchain_community.callbacks import get_openai_callback
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableConfig, RunnableLambda
from langchain_openai import ChatOpenAI


def parse_or_fix(text: str, config: RunnableConfig):
    fixing_chain = (
        ChatPromptTemplate.from_template(
            "Fix the following text:\n\n```text\n{input}\n```\nError: {error}"
            " Don't narrate, just respond with the fixed data."
        )
        | ChatOpenAI()
        | StrOutputParser()
    )
    for _ in range(3):
        try:
            return json.loads(text)
        except Exception as e:
            # Forward the config so callbacks/tags propagate to the inner chain
            text = fixing_chain.invoke({"input": text, "error": e}, config)
    return "Failed to parse"


with get_openai_callback() as cb:
    output = RunnableLambda(parse_or_fix).invoke(
        "{foo: bar}", {"tags": ["my-tag"], "callbacks": [cb]}
    )
    print(output)
    print(cb)
````

In this example, the config (including the `my-tag` tag and the OpenAI callback) is forwarded to the chain invoked inside `parse_or_fix`, so the callback records the token usage of the fixing attempts.

Additionally, you can use the `LambdaWrapper` from `langchain_community` to invoke an AWS Lambda function:

```python
import json

from langchain_community.utilities.awslambda import LambdaWrapper

# Ensure you have boto3 installed and configured:
#   pip install boto3
#   aws configure
lambda_wrapper = LambdaWrapper(function_name="your_lambda_function_name")


def lambda_handler(event, context):
    query = event.get("query", "")
    result = lambda_wrapper.run(query)
    return {
        "statusCode": 200,
        "body": json.dumps(result),
    }


# Example usage
query = "Your input query"
response = lambda_wrapper.run(query)
print(response)
```

In this example, the `LambdaWrapper` calls the configured Lambda function with the given query and returns its response.
-
@dosu I have used `ensure_config` but still the same issue, PFB logs; the LLM is still populating it.
CODE I'M USING
-
Team, can you please delete this issue? It contains some private variables in the code.
-
@dosu Team, can you please delete this issue? It contains some private variables in the code.
-
@dosu I have a similar kind of problem, but I have used LangServe with `add_routes(RunnableLambda(my_func), path="/myfunc")` and `def my_func(input: str, config: RunnableConfig)`. How can I send the RunnableConfig in the request body?
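For reference, LangServe's `/invoke` endpoint reads the runtime config from a `config` key in the JSON body, next to `input`; which config keys the server actually honors is controlled by the `config_keys` argument of `add_routes`. A minimal sketch of such a request body (the path and the `user_id` configurable key are hypothetical):

```python
import json

# JSON body for: POST http://localhost:8000/myfunc/invoke
# "user_id" under "configurable" is a hypothetical key; the server must
# allow it via add_routes(..., config_keys=["configurable"]).
body = {
    "input": "hello",
    "config": {"configurable": {"user_id": "u-123"}},
}
print(json.dumps(body))
```

Inside `my_func`, that value would then be available as `config["configurable"]["user_id"]`.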
-
Checked other resources
Commit to Help
Example Code
Description
I am trying to pass sensitive information at runtime using RunnableConfig to the AWS Lambda tool wrapper, but it is not getting reflected in my Lambda function. Do you have any example or notebook where a RunnableConfig is passed to tools for any tool wrapper? I am going through the links below, but there the config is passed to a function defined in the notebook.
https://python.langchain.com/v0.2/docs/how_to/tool_runtime/
https://langchain-ai.github.io/langgraph/how-tos/pass-config-to-tools/
System Info
langgraph, langchain latest