Implementing AI Agents and Copilots using Azure OpenAI (IAIACOIA)

 

Course Overview

This workshop is designed to help you develop AI agents and copilots using Azure OpenAI. The workshop is divided into four modules, each covering a different aspect of developing AI solutions with Azure OpenAI. It bundles the Microsoft Applied Skills courses listed below (Modules 1–3) and takes a deep dive into Semantic Kernel. To provide a full picture of the AI development process, we have added a fourth module on monitoring and deploying LLM applications.

Who should attend

AI developers who want to learn how to create AI agents or who want to earn one or more Microsoft Applied Skills credentials.

Prerequisites

Knowledge of C# and Python is useful.

Course Objectives

Create AI agents and copilots with Azure OpenAI and prepare for the three Microsoft Applied Skills credentials covered in this course.

Course Content

Module 1: Develop Generative AI Solutions with Azure OpenAI Service

This module introduces Azure OpenAI Service: how to access and use it, explore the available generative AI models, and deploy them. It explains the difference between completions and chat and how to use prompts to get completions from models. It also guides you through testing models in the Azure OpenAI Studio playgrounds and integrating Azure OpenAI into applications via the REST API and the SDKs. Finally, the module delves into prompt engineering, generating code and images, implementing Retrieval Augmented Generation (RAG), and planning responsible generative AI solutions.
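As a first taste of the SDK integration covered here, the sketch below sends a chat request to an Azure OpenAI deployment with the Python openai package; the endpoint, API key, and deployment name are placeholders you would replace with your own values.

    from openai import AzureOpenAI

    # Placeholders: use your own resource endpoint, API key, and deployment name.
    client = AzureOpenAI(
        azure_endpoint="https://<your-resource>.openai.azure.com/",
        api_key="<your-api-key>",
        api_version="2024-02-01",
    )

    response = client.chat.completions.create(
        model="<your-deployment-name>",  # the deployment name, not the base model name
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Explain the difference between completions and chat."},
        ],
    )
    print(response.choices[0].message.content)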

Module 2: Develop custom Copilots with Azure AI Studio

This module provides an introduction to Azure AI Studio, highlighting its core features, capabilities, and use cases. It explains how to build a RAG-based copilot solution with your own data and covers the basics of developing copilots with Prompt Flow. The module also shows how to integrate a fine-tuned language model with your copilot and how to evaluate its performance, and it emphasizes understanding the development lifecycle and using LangChain in Prompt Flow.
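Outside of Prompt Flow, the RAG pattern behind such a copilot can be sketched in a few lines: retrieve matching chunks from your own data, then ground the model's answer in them. The example below assumes a hypothetical Azure AI Search index with a 'content' field; all endpoints, keys, and names are placeholders.

    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents import SearchClient
    from openai import AzureOpenAI

    search_client = SearchClient(
        endpoint="https://<search-resource>.search.windows.net",
        index_name="<your-index>",
        credential=AzureKeyCredential("<search-key>"),
    )
    openai_client = AzureOpenAI(
        azure_endpoint="https://<your-resource>.openai.azure.com/",
        api_key="<your-api-key>",
        api_version="2024-02-01",
    )

    question = "How do I reset my device?"

    # Retrieve: pull the best-matching chunks from your own data.
    hits = search_client.search(search_text=question, top=3)
    context = "\n".join(doc["content"] for doc in hits)  # assumes a 'content' field

    # Augment and generate: ground the answer in the retrieved context.
    answer = openai_client.chat.completions.create(
        model="<your-deployment-name>",
        messages=[
            {"role": "system", "content": "Answer only from this context:\n" + context},
            {"role": "user", "content": question},
        ],
    )
    print(answer.choices[0].message.content)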

Module 3: Develop AI agents using Azure OpenAI and the Semantic Kernel SDK

This module focuses on building AI agents using the Semantic Kernel SDK, starting with understanding the purpose of Semantic Kernel and effective prompting techniques. It explains how to give AI agents skills using Native Functions and create plugins for Semantic Kernel. The module also covers providing state and history using Kernel Memory, using intelligent planners, and integrating various AI services with Semantic Kernel. Additionally, it discusses implementing copilots and agents, completing multi-step tasks, and using personas with agents.
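To illustrate the core ideas of this module, the sketch below uses the semantic-kernel Python package to build a kernel, register an Azure OpenAI chat service and a small native-function plugin, and invoke a prompt. All names and credentials are placeholders, and the exact API surface can vary between Semantic Kernel versions.

    import asyncio
    from datetime import datetime, timezone

    from semantic_kernel import Kernel
    from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
    from semantic_kernel.functions import kernel_function


    class TimePlugin:
        """A native function gives the agent a skill the model alone does not have."""

        @kernel_function(name="get_utc_time", description="Returns the current UTC time.")
        def get_utc_time(self) -> str:
            return datetime.now(timezone.utc).isoformat()


    async def main() -> None:
        kernel = Kernel()
        kernel.add_service(
            AzureChatCompletion(
                deployment_name="<your-deployment-name>",
                endpoint="https://<your-resource>.openai.azure.com/",
                api_key="<your-api-key>",
            )
        )
        kernel.add_plugin(TimePlugin(), plugin_name="time")

        result = await kernel.invoke_prompt(prompt="Greet the user in one short sentence.")
        print(result)


    asyncio.run(main())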

Module 4: Monitoring & Deploying LLM Applications

This module outlines the deployment process for LLM applications, including an introduction to Azure Container Apps and how to deploy LLM applications to it. It explains how to scale Azure OpenAI for .NET chat using RAG with Azure Container Apps and how to manage dynamic sessions. The module also covers monitoring and managing LLM applications to ensure optimal performance.
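On the monitoring side, a common building block is exporting traces to Azure Monitor / Application Insights via OpenTelemetry. The minimal sketch below assumes the azure-monitor-opentelemetry package and a placeholder connection string; each LLM call is wrapped in a span so its latency and failures show up in Azure Monitor.

    from azure.monitor.opentelemetry import configure_azure_monitor
    from opentelemetry import trace

    # Placeholder: the connection string of your Application Insights resource.
    configure_azure_monitor(connection_string="<application-insights-connection-string>")

    tracer = trace.get_tracer(__name__)

    # Wrap each model call in a span so latency and errors are exported.
    with tracer.start_as_current_span("chat-request"):
        pass  # call your Azure OpenAI deployment here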

Prices & Delivery methods

Online Training
  Duration: 4 days
  Price: on request

Classroom Training
  Duration: 4 days
  Price: on request
 

Schedule

Instructor-led Online Training: Course conducted online in a virtual classroom.

German

Time zone: Central European Time (CET)

Online Training    Time zone: Central European Time (CET)            Course language: German
Online Training    Time zone: Central European Summer Time (CEST)    Course language: German