Execute Python scripts #4680
-
Hi team, is there any way I can run Python scripts in a Flowise workflow? Here is my scenario:
Replies: 1 comment 1 reply
-
How to Run Python Scripts in Flowise Workflow
Scenario:
Run a Python script
Send the script output to an LLM
Capture the LLM's output
Display the result to the user
Solution Options
🔹 Option 1: Use Custom Function Node + External Python API
Recommended if you're hosting Python scripts or have a backend server.
Steps:
Create an API using Flask or FastAPI to run your Python script and return the output.
In Flowise, use the "API" node or "Custom Function" node to call this API.
Send the API response to the LLM Node (OpenAI, Anthropic, etc.).
Display the final response to the user using a UI node or Chat Interface.
Visual Flow:
[User Input]
↓
[API Node] → cal…
Pros:
Simple and modular
Supports any Python logic
Scalable
🔹 Option 2: Use Langchain's PythonREPLTool in a Custom Agent
Flowise uses Langchain under the hood. If you're self-hosting, you can create a custom agent that uses Langchain's PythonREPLTool.
How:
Create a custom node or agent
Enable Python code execution inside it
Send the output to the LLM and show the result
Requirements:
Self-hosted Flowise
Intermediate Langchain knowledge
🔹 Option 3: Use Node.js child_process to Execute Python
If you're customizing Flowise nodes (a Node.js environment), you can use:

```js
const { exec } = require('child_process');

exec('python3 script.py', (error, stdout, stderr) => {
  if (error) {
    console.error(stderr);
    return;
  }
  console.log(stdout); // script output, ready to hand back to the flow
});
```

Then return the output back to your Flowise nodes.

Summary Table

| Task | Tool / Method |
| --- | --- |
| Run Python script | Flask / FastAPI endpoint |

Python Script with FastAPI
Create a FastAPI server to run your Python code.

main.py:

```python
from fastapi import FastAPI
from pydantic import BaseModel  # this import was missing in the original snippet

app = FastAPI()

class InputData(BaseModel):
    text: str  # illustrative field name; adapt it to your script's input

@app.post("/run-script")
def run_script(data: InputData):
    # run your Python logic here and return its output
    return {"output": data.text}
```

Install dependencies and run the server:

```
pip install fastapi uvicorn
uvicorn main:app --host 0.0.0.0 --port 8000
```
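On Option 2: Langchain's PythonREPLTool essentially executes a Python snippet and returns whatever it prints. A minimal stdlib-only sketch of that idea (this is an illustration of the concept, not Langchain's actual implementation; the `MiniPythonREPL` name is made up):

```python
import contextlib
import io

class MiniPythonREPL:
    """Illustrative stand-in for a Python REPL tool:
    execute a code snippet and capture anything it prints."""

    def run(self, code: str) -> str:
        buffer = io.StringIO()
        # Redirect stdout so print() output becomes the tool's result.
        with contextlib.redirect_stdout(buffer):
            exec(code, {})  # NOTE: exec is unsafe with untrusted input
        return buffer.getvalue().strip()

repl = MiniPythonREPL()
print(repl.run("print(2 + 3)"))  # → 5
```

The captured string is what the agent would pass back to the LLM as the tool's observation.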
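Whichever option you pick, the core server-side step is the same: spawn the Python interpreter, run the script, and capture its stdout, which is the string you then forward to the LLM node. A minimal sketch using Python's subprocess module (the throwaway temp script here stands in for your real script.py):

```python
import subprocess
import sys
import tempfile

def run_python_script(path: str) -> str:
    """Run a Python script in a subprocess and return its stdout."""
    result = subprocess.run(
        [sys.executable, path],  # same interpreter, any script path
        capture_output=True,
        text=True,
        timeout=30,              # don't let a hung script block the flow
    )
    if result.returncode != 0:
        raise RuntimeError(f"script failed: {result.stderr.strip()}")
    return result.stdout.strip()

# Demo with a throwaway script (stands in for your real script.py).
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("print('hello from the script')")
print(run_python_script(f.name))  # → hello from the script
```

Inside a Flask/FastAPI endpoint (Option 1) this function body is what the route handler would call; the Node `child_process` snippet in Option 3 does the same thing from JavaScript.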