Workshop for building intelligent AI solutions using Azure AI Foundry, featuring Vector Search, RAG, Agentic AI, and multi-agent orchestration with LangChain and Azure AI Search.

⚠️
This project is currently in active development and may contain breaking changes.
Updates and modifications are being made frequently, which may impact stability or functionality. This notice will be removed once development is complete and the project reaches a stable release.

Azure AI Foundry Workshop: Vector Search, Agentic AI, and LLM Orchestration

Overview

This Azure AI Foundry Workshop provides participants with practical, hands-on experience in building and deploying Retrieval-Augmented Generation (RAG) and Agentic AI solutions using Azure AI Foundry, Azure AI Search, Azure AI Agent Service, LangGraph and Semantic Kernel Agent Framework.

Participants will learn how to create intelligent agents that not only respond, but also take action. By integrating OpenAPI endpoints and orchestrating workflows across multiple agents, you'll build solutions that are dynamic, context-aware, and production-ready.

By the end of the workshop, you'll have:

  • Deployed a fully functional AI workshop environment using Azure AI Foundry.
  • Built RAG pipelines using Azure AI Search and document embeddings.
  • Explored agentic patterns using LangGraph and Semantic Kernel: single-agent, supervisor-agent, and networked agents.
  • Integrated structured external data via OpenAPI and GraphQL endpoints—giving agents the ability to query real-time data and take action through external systems.
  • Built intelligent agents using Python code, while also exploring low-code tools for LLM orchestration and agent implementation.
  • Leveraged Model Context Protocol (MCP) to define tools once and expose them uniformly, supporting scalable, modular architectures by grouping tools across MCP servers.

🔧 What You’ll Build

  • Vector Search & RAG with Azure AI Search
    Learn to index documents, generate embeddings, and implement semantic retrieval to support LLM-based answers grounded in your data.

  • Agentic AI with LangGraph, Azure AI Agent Service and Semantic Kernel
    Use prebuilt and custom agents to delegate tasks, make decisions, and interact with APIs. Experiment with orchestration patterns including single-agent flows, supervisor models, and decentralized networks.

  • Real-World Integrations with OpenAPI, GraphQL, and MCP
    Connect your agents to external services to perform real-world actions like retrieving live data, triggering workflows, or interacting with apps and systems — leveraging Model Context Protocol (MCP) to define tools once and expose them consistently across scalable, modular architectures.
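The "define tools once, expose them uniformly" idea behind MCP can be illustrated with a minimal, framework-free sketch. The registry, decorator, and tool below are hypothetical stand-ins, not the actual MCP SDK, which exposes tools over a client/server protocol:

```python
# Illustrative sketch only: a registry where tools are defined once and
# any agent can discover and invoke them by name, mirroring how MCP
# servers advertise tools to clients.

from typing import Callable, Dict


class ToolRegistry:
    """Registers tools once; agents discover and invoke them uniformly."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable] = {}

    def tool(self, name: str):
        def decorator(fn: Callable) -> Callable:
            self._tools[name] = fn
            return fn
        return decorator

    def list_tools(self) -> list[str]:
        return sorted(self._tools)

    def invoke(self, name: str, **kwargs):
        return self._tools[name](**kwargs)


registry = ToolRegistry()


@registry.tool("get_weather")
def get_weather(city: str) -> str:
    # Stub: a real MCP weather tool would call an external API here.
    return f"Forecast for {city}: sunny"


print(registry.list_tools())
print(registry.invoke("get_weather", city="Seattle"))
```

Grouping related tools behind one registry (or one MCP server) is what keeps the architecture modular: agents depend on the tool contract, not on where the tool runs.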


🛠️ Workshop Steps

Follow these key steps to successfully implement and deploy the workshop:

1. Deploy the environment: step-by-step instructions to deploy Azure AI Foundry and all required services for the workshop environment, including:

  • Azure AI Foundry components: AI Service, AI Hub, Projects, and Compute
  • Azure AI Search for knowledge retrieval and vector-based search
  • Azure Storage Account for document storage and data ingestion
  • Azure Functions for event-driven document chunking, embedding generation, and indexing
  • Azure Web App to enable agent interactions via OpenAPI and GraphQL integrations
  • Azure Container Apps to host NGINX-routed Model Context Protocol (MCP) containers, providing agent tools for weather data, OpenAPI access, and Azure AI hybrid search

2. Vectorize documents: step-by-step instructions for vectorizing document data with Azure AI Search and quickly leveraging the data using Azure AI Foundry's built-in Chat Playground.

  • Processing documents, chunking them, generating embeddings with Ada-002, and indexing them into Azure AI Search
  • Interacting with indexed data for semantic retrieval using the Azure AI Foundry Chat Playground
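The chunking step above can be sketched without any Azure dependencies. A fixed-size chunker with overlap keeps context that spans a boundary present in both neighboring chunks; the sizes here are illustrative, not the workshop Function's actual settings:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks with overlap, so a sentence that
    straddles a chunk boundary still appears whole in one of the chunks."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]


# 1200 characters with 500-char chunks and 50-char overlap -> 3 chunks
chunks = chunk_text("x" * 1200, chunk_size=500, overlap=50)
print(len(chunks))  # 3
```

Each chunk would then be embedded (with Ada-002 in this workshop) and uploaded to the Azure AI Search index as its own document.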

3. Build agents: interactive notebooks and guides that walk you through building intelligent, task-driven agents. These curated resources cover:

  • Semantic retrieval and vector search powered by Azure AI Search
  • Orchestrating multi-agent workflows with LangGraph, Azure AI Agent Service, and Semantic Kernel Agent Framework
  • Connecting to external tools and services through the Model Context Protocol (MCP) for scalable, modular agent tool invocation
  • Integration with real-time external systems and APIs via OpenAPI and GraphQL, enabling agents to interact with dynamic, structured data sources
  • Built-in agent evaluation using the Azure AI Evaluation SDK to measure groundedness, coherence, and overall performance
  • Tracing and diagnostics with Azure AI Inference Tracer to monitor, debug, and optimize agent behavior across workflows
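To make the groundedness idea concrete, here is a toy proxy metric: the fraction of answer tokens that also appear in the retrieved context. This is purely illustrative; the Azure AI Evaluation SDK used in the notebooks relies on LLM-based raters, not token overlap:

```python
# Toy groundedness proxy: what fraction of the answer's tokens appear in
# the retrieved context? A score near 1.0 suggests the answer stays close
# to its sources; a low score flags possible hallucination.

def groundedness_score(answer: str, context: str) -> float:
    answer_tokens = set(answer.lower().split())
    context_tokens = set(context.lower().split())
    if not answer_tokens:
        return 0.0
    return len(answer_tokens & context_tokens) / len(answer_tokens)


score = groundedness_score(
    "the index updates automatically",
    "when a pdf is uploaded the index updates automatically",
)
print(round(score, 2))  # 1.0 -- every answer token appears in the context
```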

📐 Workshop Design and Architecture

[Architecture diagram]

This solution combines the power of Azure AI Foundry, LangGraph, and the Azure AI Agents Service to build an advanced, modular AI orchestration framework. It demonstrates a production-ready architecture designed for scalable, intelligent applications that require real-time reasoning, search, and structured data integration.

At a high level, the architecture consists of the following key components:

Azure AI Foundry Core Services

The deployment includes Azure AI Foundry’s full stack—AI Hub, AI Services, AI Projects, and Compute Instances—providing a secure and managed environment for developing and running generative AI applications. Compute Instances are pre-configured to support Visual Studio Code (web), enabling a browser-based development experience for running and modifying sample notebooks directly within the Foundry environment.

Vector Search and RAG Implementation

Unstructured documents, such as PDFs, are preprocessed through an Azure Function that chunks documents, generates vector embeddings using OpenAI's Ada-002 model, and indexes them into Azure AI Search. This enables Retrieval-Augmented Generation (RAG) capabilities by grounding responses in your custom knowledge base.
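The retrieval half of RAG reduces to ranking indexed chunks by similarity to the query embedding. The tiny vectors below are made up for illustration; Azure AI Search performs the same cosine-similarity ranking at scale over its vector index:

```python
# Conceptual sketch of vector retrieval: rank documents by cosine
# similarity to a query embedding, then ground the LLM's answer in the
# top hit. Embeddings here are tiny made-up vectors, not Ada-002 output.

import math


def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


docs = {
    "doc-weather": [0.9, 0.1, 0.0],
    "doc-finance": [0.1, 0.9, 0.1],
    "doc-sports":  [0.0, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]  # pretend embedding of "what's the forecast?"

best = max(docs, key=lambda d: cosine(query, docs[d]))
print(best)  # doc-weather
```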

Multi-Agent System with LangGraph

Agents are orchestrated using LangGraph, a framework that enables complex workflows through node-based logic. A Supervisor Agent coordinates multiple specialized agents, allowing for role-based delegation, context-aware task execution, and adaptive reasoning.
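The supervisor pattern can be sketched in plain Python: one routing function delegates to specialized agents. LangGraph models this as a graph of nodes with conditional edges, and a real supervisor would ask an LLM to choose the route; the keyword check below is a deliberate stand-in for that decision:

```python
# Minimal supervisor-agent sketch: a router delegates tasks to
# specialized worker agents. The keyword-based routing is a stand-in for
# an LLM routing decision inside a LangGraph conditional edge.

def search_agent(task: str) -> str:
    return f"[search agent] retrieved documents for: {task}"


def api_agent(task: str) -> str:
    return f"[api agent] called external API for: {task}"


AGENTS = {"search": search_agent, "api": api_agent}


def supervisor(task: str) -> str:
    """Choose a worker agent for the task and delegate to it."""
    route = "api" if "order" in task.lower() else "search"
    return AGENTS[route](task)


print(supervisor("find docs about vector search"))
print(supervisor("place an order for part 42"))
```

The same delegation logic generalizes to the networked pattern covered in the notebooks, where agents hand tasks to each other without a central supervisor.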

Tool Integration with OpenAPI and GraphQL

To enable agents to interact with structured external data sources, the solution integrates tools via OpenAPI (for RESTful APIs) and GraphQL (for schema-based query interfaces). These tools extend agent capabilities, allowing them to fetch, query, or write to external systems dynamically during conversation.
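Under the hood, a GraphQL tool call is just an HTTP POST whose JSON body carries a `query` string and a `variables` object. A minimal sketch of packaging such a request (the schema fields are hypothetical, not the workshop API's):

```python
# Sketch of how an agent tool might package a GraphQL call. GraphQL is
# transported as an HTTP POST with a JSON body of "query" + "variables";
# the product/id/name/price fields below are invented for illustration.

import json


def build_graphql_request(query: str, variables: dict) -> bytes:
    """Serialize a GraphQL operation into a JSON POST body."""
    return json.dumps({"query": query, "variables": variables}).encode("utf-8")


body = build_graphql_request(
    "query($id: ID!) { product(id: $id) { name price } }",
    {"id": "42"},
)
payload = json.loads(body)
print(sorted(payload))  # ['query', 'variables']
```

OpenAPI tools work the same way at the transport level; the difference is that the endpoint's schema (paths and parameters vs. a typed query language) is what the agent uses to construct valid calls.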

Event-Driven Data Ingestion

Document processing is fully event-driven. When a PDF is uploaded to the designated storage container, an Azure Function is triggered to process the document end-to-end—from chunking to indexing—ensuring the search index remains up-to-date.

♻️ Clean-Up

After completing the workshop and testing, ensure you delete any unused Azure resources or remove the entire Resource Group to avoid additional charges.


📜 License

This project is licensed under the MIT License, granting permission for commercial and non-commercial use with proper attribution.


Disclaimer

This workshop and demo application are intended for educational and demonstration purposes. They are provided "as-is" without any warranties, and users assume all responsibility for their use.
