Microsoft Learn Contributor Chatmode

· 14 min read

Chat Modes are predefined configurations that enable you to tailor the AI chat behavior in Visual Studio Code for specific tasks, such as asking questions, making code edits, or performing autonomous coding tasks. You can switch between chat modes at any time in the Chat view, depending on the task you want to accomplish. Chat Modes let us customize the responses from GitHub Copilot, exposing more specific tools and commands.

Much like custom Copilot instructions, which can work alongside them, custom chat modes give us a more specific structure. In this example, we will create a custom chat mode that can assist us with writing and editing documentation for Microsoft Learn, aligned to the Microsoft Writing Style Guide.
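
To give a flavour of what the full post builds, a custom chat mode is just a Markdown file with a small frontmatter block. The sketch below is a minimal, hypothetical example: the file name, description, and tool list are illustrative placeholders rather than the exact mode created in the post, and it assumes workspace chat modes live under .github/chatmodes/.

```markdown
---
description: 'Assist with writing and editing Microsoft Learn documentation in the Microsoft Writing Style.'
tools: ['codebase', 'search', 'fetch'] # illustrative tool names - swap in the tools you actually want
---
You are a Microsoft Learn contributor assistant.

- Follow the Microsoft Writing Style Guide: sentence-case headings, active voice, and a warm, direct tone.
- Address the reader as "you" and keep sentences short.
- When editing existing articles, preserve the original front matter and metadata.
```

Once saved in the workspace, the mode appears alongside the built-in Ask, Edit, and Agent modes in the Chat view's mode picker.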

Deploying Azure Service Groups with Terraform AzAPI

· 7 min read

Microsoft recently unveiled Service Groups in Azure, now available in limited public preview - sign up for the preview and test it out! You need to be enrolled in the public preview to provision them, either through the Portal or via the API (e.g., Terraform, Bicep, PowerShell, REST API calls).

If you’ve ever struggled with managing sprawling applications across multiple resource groups, subscriptions, and teams, Service Groups are designed with you in mind.

Where Azure Policy, Resource Graph, Tags, and Management Groups give you compliance, visibility, and hierarchy, Service Groups add an entirely new dimension: flexible, application-centric grouping of your Azure resources, without being limited by deployment boundaries.

Imagine being able to view, report on, and manage an entire application or workload, regardless of where its resources reside. Whether you’re in FinOps trying to track costs, in Ops trying to view health, or a security lead wanting to understand exposure, Service Groups give you the lens you’ve been missing.

Today we are going to look at using the Terraform AzAPI to deploy Service Groups in Azure.
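
As a preview of where we'll end up, the deployment looks something like the sketch below. It assumes the AzAPI provider v2 (so the body can be a plain HCL object), and the resource type's api-version and the parent resource ID are assumptions to confirm against the preview documentation rather than values to copy verbatim.

```hcl
terraform {
  required_providers {
    azapi = {
      source  = "Azure/azapi"
      version = "~> 2.0" # v2+ accepts 'body' as an HCL object rather than a JSON string
    }
  }
}

provider "azapi" {}

# Service Groups are tenant-scoped, so the parent_id is the tenant root ("/").
# The api-version below is an assumption for the limited public preview - check
# the current version before deploying anything.
resource "azapi_resource" "service_group" {
  type      = "Microsoft.Management/serviceGroups@2024-02-01-preview"
  name      = "sg-contoso-app"
  parent_id = "/"

  # Preview resource types may not be in the provider's embedded schema yet.
  schema_validation_enabled = false

  body = {
    properties = {
      displayName = "Contoso Application"
      parent = {
        resourceId = "/providers/Microsoft.Management/serviceGroups/<parent-or-root-service-group>"
      }
    }
  }
}
```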

Model Context Protocol (MCP) in VS Code with Microsoft Learn

· 17 min read

The Model Context Protocol is an open standard that enables developers to build secure, two-way connections between their data sources and AI-powered tools. The architecture is straightforward: Developers can expose their data through MCP servers or build AI applications (MCP clients) that connect to these servers.

An MCP server can expose different types of primitives, which extend what your AI applications and clients can do:

Resources are a core primitive in the Model Context Protocol (MCP) that allow servers to expose data and content that clients can read and use as context for LLM interactions.

Prompts enable servers to define reusable prompt templates and workflows that clients can quickly surface to users and LLMs. They provide a powerful way to standardize and share everyday LLM interactions.

Tools are a powerful primitive in MCP that enable servers to expose executable functionality to clients. Through tools, LLMs can interact with external systems, perform computations, and take actions in the real world.

In our demo, the client will be Visual Studio Code. It will connect to an MCP server over HTTPS to utilize the document search tool. This tool lets us retrieve the same content that services like Copilot for Azure have access to in the Microsoft Learn semantic search index. By using this index, we can ground our responses in the current documentation. For example, the system will know that Azure Active Directory is now called Entra ID, or which services are generally available within a specific region.
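
Wiring a remote MCP server into VS Code comes down to a small JSON file. The sketch below assumes a workspace-level .vscode/mcp.json and uses the publicly documented Microsoft Learn MCP endpoint; treat the server name and URL as values to verify against the current docs, and note that the inline comment relies on VS Code's usual JSON-with-comments parsing.

```json
{
  "servers": {
    // The server name is arbitrary; the URL is assumed to be the Microsoft Learn MCP endpoint - verify before use.
    "microsoft-learn": {
      "type": "http",
      "url": "https://learn.microsoft.com/api/mcp"
    }
  }
}
```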

Automate Azure Bastion with Drasi Realtime RBAC Monitoring

· 46 min read

Drasi (named after the Greek word for 'action') is a change data processing platform that automates real-time detection, evaluation, and meaningful reaction to events in complex, event-driven systems. Created as part of the Azure Incubations team, Drasi was accepted into the Cloud Native Computing Foundation at the Sandbox maturity level in January 2025.

I was fortunate enough to witness a demo of this in action and wondered how I might learn to use Drasi with something I am familiar with - the Microsoft Azure ecosystem. The Azure Role Assignment Monitor with Drasi was born.

Validate Azure Zone Redundancy with az zones CLI

· 7 min read

Reliability (resiliency, availability, recovery) is one of the main pillars of the Azure Well-Architected Framework. It is essential for ensuring that applications and services remain operational and performant, even in the face of failures or unexpected events. Reliability encompasses various aspects, including fault tolerance, disaster recovery, and high availability.

Reliability is also a shared responsibility between the cloud provider (i.e., Microsoft) and the customer. While Azure provides a robust infrastructure and services designed for reliability, customers must also implement best practices and strategies to ensure their applications are resilient and can recover from failures.

[Image: Reliability as a shared responsibility between Microsoft and the customer]

But a key question arises: how do we check the reliability (in this example, the zone redundancy of our workload)?

One of the tools we can use for this is the az zones command line tool.
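
If you want to try it before reading on, the rough shape of the workflow is sketched below. The extension name, subcommand, and flags here are illustrative assumptions rather than exact syntax, so lean on az zones --help for the real options.

```bash
# Install the Azure CLI extension that provides the 'az zones' command
# (the extension name is an assumption - confirm with 'az extension list-available').
az extension add --name zones

# Validate zone redundancy for the resources in a resource group.
# The subcommand and flag names are illustrative; check 'az zones --help' for the exact syntax.
az zones validate --resource-group rg-my-workload --output table
```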