As a user, I want to drag a “Notes” block to a “Summarizer” block, so I can get a summary of my note using an AI model and see the result on the canvas.

Steps to achieve this milestone:

1. Define a service block. Create a new block type on the canvas representing an LLM or AI service (e.g., “Summarizer,” “ChatGPT,” “Local LLM”). Visually distinguish it from local asset blocks (different color, icon, etc.).
2. Implement data flow. When a user draws a wire from a local asset to a service block, trigger a function that sends the asset’s content to the service. For a prototype, a simple API call to OpenAI, the Hugging Face Inference API, or a local LLM server (such as Ollama or LM Studio) is enough.
3. Display the result. Show the output (e.g., summary, answer) as a tooltip, modal, or a new “result” block on the canvas. Optionally, indicate visually that data has crossed the privacy boundary when the service is external.
4. Abstract the service layer.
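The steps above can be sketched as a small service-layer abstraction. This is a minimal, hypothetical sketch: the `SummarizerService` interface, `LocalLlmService` class, and `onWireConnected` handler are illustrative names, not part of the project, and the real implementation would POST to an actual backend (e.g., an Ollama server, commonly at `http://localhost:11434`) instead of the stub used here.

```typescript
// Hypothetical sketch of the service layer behind a "Summarizer" block.
// Any backend (OpenAI, Hugging Face, local LLM) sits behind one interface,
// so the canvas code never needs to know which service it is wired to.

interface SummarizerService {
  name: string;
  isExternal: boolean; // true if data crosses the privacy boundary
  summarize(content: string): Promise<string>;
}

// A local service stub. A real version would call a local LLM server
// (e.g., Ollama); here the network call is replaced with a placeholder.
class LocalLlmService implements SummarizerService {
  name = "Local LLM";
  isExternal = false;
  async summarize(content: string): Promise<string> {
    const wordCount = content.trim().split(/\s+/).length;
    return `Summary of ${wordCount} words (stubbed)`;
  }
}

// Triggered when a user draws a wire from an asset block to a service block.
// Returns the result to display plus a flag for the boundary-crossing cue.
async function onWireConnected(
  assetContent: string,
  service: SummarizerService
): Promise<{ result: string; crossedBoundary: boolean }> {
  const result = await service.summarize(assetContent);
  return { result, crossedBoundary: service.isExternal };
}
```

Swapping in an external service would only mean adding another class implementing `SummarizerService` with `isExternal = true`; the wiring code stays unchanged.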
No due date · 0/4 issues closed

- Canvas with yellow border
- Draggable blocks for local/external assets
- Visual wiring between blocks
- Visual cue when a wire crosses the privacy boundary
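The boundary-crossing cue in the deliverables above can be reduced to a small geometric check. This is a sketch under an assumed model: the privacy boundary is a vertical line at `boundaryX` on the canvas, and `Block` and `wireCrossesBoundary` are illustrative names.

```typescript
// Hypothetical sketch: detect when a wire crosses the privacy boundary,
// modeled as a vertical line at x = boundaryX on the canvas.

interface Block {
  x: number;
  y: number;
}

function wireCrossesBoundary(a: Block, b: Block, boundaryX: number): boolean {
  // The wire crosses if its two endpoints lie on opposite sides of the
  // boundary line (the signed offsets have opposite signs).
  return (a.x - boundaryX) * (b.x - boundaryX) < 0;
}
```

When this returns `true`, the canvas can restyle the wire (e.g., dashed red) as the visual cue that data is leaving the local side.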