Replies: 6 comments
-
For reference, here is a video that explains how the Model Context Protocol (MCP) works and why it will be key to standardizing agentic AI:
-
FYI, Paulus spoke about MCP during the Home Assistant 2025.2 release party video; check out his explanation (starting at timestamp 56:40). Their voice development team also spoke about it in the Voice Chapter 9 video. Also check out:
-
Using the patterns from Fabric as tools in n8n lets me customize and improve each pattern, giving each one access to different integrations. Combining that with an MCP server lets me use all the tools like a USB port, making it easy to plug in each pattern as a tool; it feels like giving superpowers to the AI. Using the Fabric MCP server in Claude, I can see how easily the AI works with the Model Context Protocol across different tools, creating a chain of thought that brings more precise and smarter output. Claude only allows me a maximum of 182 tools, but for home integration that should be easy enough to set up.
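For anyone who wants to experiment with this locally, here is a minimal sketch of what exposing a single Fabric pattern as an MCP tool could look like. It assumes the official Python `mcp` SDK (its FastMCP helper) and the `fabric` CLI with its `--pattern` flag; the server name and wiring are illustrative, not any project's actual implementation:

```python
import subprocess

from mcp.server.fastmcp import FastMCP

# Hypothetical server name; any MCP-capable client can launch this over stdio.
mcp = FastMCP("fabric-patterns")


@mcp.tool()
def summarize(text: str) -> str:
    """Run Fabric's 'summarize' pattern on the given text via the fabric CLI."""
    result = subprocess.run(
        ["fabric", "--pattern", "summarize"],  # assumes the fabric CLI is on PATH
        input=text,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout


if __name__ == "__main__":
    mcp.run()  # defaults to stdio transport, which clients like Claude Desktop expect
```

An MCP-capable client such as Claude Desktop could then launch this script over stdio and call `summarize` like any other tool.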
-
A2A (Agent2Agent Protocol) integration support might also be relevant as a complementary feature to MCP support. A2A is a new open protocol from Google enabling communication and interoperability between agentic AI applications, and it is positioned as complementary to MCP. Any thoughts on whether support for Google's A2A (Agent2Agent) open protocol could be integrated?
From Google's announcement: A2A is an open protocol that complements Anthropic's Model Context Protocol (MCP), which provides helpful tools and context to agents. A2A is designed to address the challenges identified in deploying large-scale, multi-agent systems. It empowers developers to build agents capable of connecting with any other agent built on the protocol, and it offers users the flexibility to combine agents from various providers. Critically, businesses benefit from a standardized method for managing their agents across diverse platforms and cloud environments. Google believes this universal interoperability is essential for fully realizing the potential of collaborative AI agents. https://youtube.com/watch?v=rAeqTaYj_aI
While introducing A2A, Google claims that building agentic AI systems demands two layers: giving an individual agent its tools and context, and coordinating between agents. MCP focuses on the first layer, organizing what agents, tools, or users send into the model, whereas A2A focuses on the second, coordination between intelligent agents. By separating tools from agents, Google is able to position A2A as complementary to, rather than in competition with, MCP. https://youtu.be/vIfagfHOLmI?si=pKEOugt3oZJlWRaj https://www.youtube.com/watch?v=voaKr_JHvF4
More from Google's description: A2A is an open protocol enabling communication and interoperability between opaque agentic applications. One of the biggest challenges in enterprise AI adoption is getting agents built on different frameworks and by different vendors to work together, so Google created the open Agent2Agent (A2A) protocol as a collaborative way to help agents across different ecosystems communicate with each other. Google is driving this open protocol initiative because it believes a common language will be critical for multi-agent communication, irrespective of the framework or vendor an agent is built on. With A2A, agents can show each other their capabilities and negotiate how they will interact with users (via text, forms, or bidirectional audio/video), all while working securely together. Google's demo video shows A2A enabling communication between different agent frameworks, and the conceptual overview describes the protocol's core concepts for communication between independent AI agents.
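To make the capability-discovery part of A2A a bit more concrete, here is a rough sketch of an agent serving an A2A-style "agent card" at the well-known discovery path. The field names are a loose, unverified reading of the early A2A spec and are illustrative only:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative agent card; field names are an assumption, not a verified schema.
AGENT_CARD = {
    "name": "fabric-agent",
    "description": "Agent that applies Fabric patterns to text",
    "url": "http://localhost:8000",  # where this agent would accept A2A tasks
    "version": "0.1.0",
    "capabilities": {"streaming": False},
    "skills": [
        {"id": "summarize", "name": "Summarize", "description": "Summarize input text"},
    ],
}


class AgentCardHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A2A clients discover agents by fetching this well-known path.
        if self.path == "/.well-known/agent.json":
            body = json.dumps(AGENT_CARD).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), AgentCardHandler).serve_forever()
```

Another agent would fetch this card, read the advertised skills, and then negotiate a task with this agent over the protocol's task endpoints; MCP, by contrast, is what the agent would use internally to reach its own tools.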
-
Everyone, feel free to jump in on my request for comments in #1454, seeking input on the Fabric MCP server that I'll implement in the next few days.
-
Please continue this discussion in #1536. Fabric-MCP is released and it is very powerful!
-
Is anyone from this project (or the Home Assistant community) working on a "Fabric integration" AI interface component for Home Assistant?
This is indirectly related to this feature request -> #1387
Please read how the Open Home Foundation and Home Assistant's founders describe the idea/concept of AI agents in smart homes:
For reference, Home Assistant is currently one of the largest open-source projects on GitHub, so most here have probably already heard of it. Here is a summary from its Wikipedia article: "Home Assistant is free and open-source software used for home automation. It serves as an integration platform and smart home hub, allowing users to control smart home devices." It includes its own built-in "Assist" virtual assistant with various pipelines, and the overall project is now owned by the non-profit Open Home Foundation, whose goals also include supporting the development of open-source projects and of open connectivity and communication standards for smart homes.
Anyway, it would be awesome to have integration(s) for Fabric so that Fabric's "Patterns" can be exposed through an MCP server (in the style of the "mcp-fetch" MCP server) to AI agents / AI tooling, letting Home Assistant use Fabric pattern prompts as tools, both from a conversation agent and from its AI tooling / AI agent framework for automations. Again, please read their vision for the role LLMs will play here:
Relevant to this is that the Home Assistant 2025.2 release added support for Model Context Protocol (MCP) server and client integrations, which could be used to extend Home Assistant's AI capabilities through file access, database connections, API integrations, and other contextual services:
https://www.home-assistant.io/blog/2025/02/13/voice-chapter-9-speech-to-phrase/#model-context-protocol-brings-home-assistant-to-every-ai
https://www.home-assistant.io/integrations/mcp
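To make the client side concrete, here is a minimal sketch that lists the tools such an MCP server advertises. It assumes the official Python `mcp` SDK and assumes Home Assistant's MCP Server integration exposes an SSE endpoint at `/mcp_server/sse` secured with a long-lived access token (check the integration docs for the exact URL); a future Fabric MCP server could be inspected the same way:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

# Both values are placeholders; check your Home Assistant instance and the
# MCP Server integration docs for the real endpoint URL and token.
HA_MCP_URL = "http://homeassistant.local:8123/mcp_server/sse"
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"


async def main() -> None:
    headers = {"Authorization": f"Bearer {HA_TOKEN}"}
    async with sse_client(HA_MCP_URL, headers=headers) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(main())
```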
See this TL;DR on Home Assistant's Model Context Protocol integration (made by @allenporter):
A follow-up question to that is whether Fabric can be used as a Model Context Protocol (MCP) server. Check out this collection for reference:
Home Assistant (which is now the top open-source project on GitHub by contributors) already features some AI tooling as well as integrations for many LLMs as conversation agents, but it does not provide specific AI tooling such as the "Patterns" that Fabric provides.
Today the AI integrations are for LLMs that normally just use Home Assistant's "Conversation" integration and, at most, integrate with Home Assistant's "Assist" (its voice assistant pipeline) via its "sentence trigger" to trigger automations.
AI conversation support for Assist in Home Assistant has, however, significantly improved over the past six months or so.
Note: Home Assistant's overall integration architecture is described for developers here:
PS: Slightly off-topic, but I highly recommend buying a Home Assistant Voice Preview Edition smart speaker to play with; that way you get a better understanding of its current scope/limitations and can hopefully see the potential of what support for Fabric could add to the mix:
Note that the Home Assistant Voice Preview Edition is only reference hardware for a new fully open-source voice ecosystem/platform:
For more backstory on their open-source voice ecosystem/platform, also check out their official release blog post here: