
Azure Availability Zone Peering

· 8 min read

In most regions, Microsoft Azure offers Availability Zones (and if your region doesn't have them yet, odds are they are on the roadmap).

Each Azure region features data centres deployed within a latency-defined perimeter. At a high level, these zones consist of 3 separate data centres with independent cooling, power, switching etc.

Azure availability zones

Azure availability zones are connected by a high-performance network with a round-trip latency of less than 2ms. They help your data stay synchronized and accessible when things go wrong. Each zone is composed of one or more datacenters equipped with independent power, cooling, and networking infrastructure. Availability zones are designed so that if one zone is affected, regional services, capacity, and high availability are supported by the remaining two zones.

With availability zones, you can design and operate applications and databases that automatically transition between zones without interruption. Azure availability zones are highly available, fault tolerant, and more scalable than traditional single or multiple datacenter infrastructures.

Availability Zone peering

Today we are going to look into Availability Zone peering:

Each data centre is assigned to a physical zone. Physical zones are mapped to logical zones in your Azure subscription. Azure subscriptions are automatically assigned this mapping when a subscription is created.

Physical Zones vs Logical Zones

There are a few things to be aware of here that I will call out:

  • Physical zones are mapped to logical zones in your Azure subscription.
  • Azure subscriptions are automatically assigned this mapping when a subscription is created.

So what does this mean?

We know we have three separate data centres within a single region:

Zone | Region
1 | Australia East
2 | Australia East
3 | Australia East

We can see these zones in the Azure Portal when we create resources:

Azure Availability Zone - Selection

This is great for making your solutions redundant against a single data centre failure and spreading your workloads across different zones; services such as Virtual Networks are zone-redundant by default, allowing access to resources across multiple zones out of the box.

One reason you might keep all your resources in a single zone is latency.

Let us go back to the paragraphs around physical and logical zones and mapping - what does this mean?

What this means is that each of the three data centres is assigned a physical AND logical mapping, so your Azure datacentres look like this:

Zone (Physical) | Region | Zone (Logical)
1 | Australia East | 3
2 | Australia East | 2
3 | Australia East | 1

When you deploy a resource into an Azure Availability Zone and select Zone 1, you choose the logical zone, NOT a physical zone.

This means that FOR EACH Microsoft Azure subscription, whether in the same Microsoft Entra ID tenancy or not, Zone 1 can be a different physical data centre.

So if you have resources deployed across multiple subscriptions, and all your resources are deployed to Zone 1 - they MAY NOT be in the same physical data centre.

Azure Subscription | Region | Zone (Logical) | Zone (Physical)
Sub A | Australia East | 1 | 1
Sub B | Australia East | 1 | 3
Sub C | Australia East | 1 | 1

In an example like the one above, you have three separate Azure subscriptions and have deployed your Virtual Machines and other resources into Zone 1 across all of them: two of the subscriptions are using the same physical zone for Zone 1, while the third is using a different physical zone altogether.
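To make the logical-to-physical mapping concrete, here is a small Python sketch. The subscription names and mapping values are made up for illustration (real mappings are fixed by Azure when each subscription is created), but it shows how the same logical zone number can resolve to different physical datacentres per subscription:

```python
# Hypothetical logical -> physical zone mappings, one per subscription.
# In Azure, each subscription gets its own fixed mapping at creation time.
ZONE_MAPPINGS = {
    "Sub A": {1: 1, 2: 2, 3: 3},
    "Sub B": {1: 3, 2: 1, 3: 2},
    "Sub C": {1: 1, 2: 3, 3: 2},
}

def physical_zone(subscription: str, logical_zone: int) -> int:
    """Resolve a logical zone number to the physical zone for a subscription."""
    return ZONE_MAPPINGS[subscription][logical_zone]

def same_physical_zone(sub_a: str, zone_a: int, sub_b: str, zone_b: int) -> bool:
    """True when two deployments land in the same physical datacentre."""
    return physical_zone(sub_a, zone_a) == physical_zone(sub_b, zone_b)

# 'Zone 1' in Sub A and 'Zone 1' in Sub B are different buildings:
print(same_physical_zone("Sub A", 1, "Sub B", 1))  # False
print(same_physical_zone("Sub A", 1, "Sub C", 1))  # True
```

The takeaway: selecting "Zone 1" everywhere does not guarantee physical co-location (or physical separation) across subscriptions.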


One of the reasons the logical and physical zones are different is due to capacity management; out of habit, many people select Zone 1 - this would mean that certain zones become overpopulated while others are underutilized. The logical zones allow Microsoft some ability to spread the load.

It's worth noting that the mapping of logical to physical Availability Zones within your region is done when the subscription is created.

Checking your Zone Peers using PowerShell and the Azure API

During normal business use, you don't need to know any of this - select a zone and deploy. If you have resources across subscriptions and run into additional latency, this may be why, although each availability zone is connected through a dedicated regional low-latency network with a round-trip latency of less than 2ms.

But suppose you are curious or want to delve deeper into your Disaster Recovery and resiliency architecture within a single region. In that case, it can be helpful to know the mapping.

This information isn't fed into the Azure Portal. To find the mapping, we need to query the Azure API directly using the Check Zone Peers endpoint.

To do this, I have written a rough PowerShell script that registers the AvailabilityZonePeering Azure feature (required to enable the lookup) and then queries the API for the mappings.

# Connect to Azure first using Connect-AzAccount

# Set the region to 'Australia East'
$region = 'Australia East'

# Get all subscriptions that the account has access to
$subscriptions = Get-AzSubscription | Select-Object -ExpandProperty SubscriptionId

# Get the access token for the authenticated user
$token = (Get-AzAccessToken).Token

# Check if the AvailabilityZonePeering feature is enabled, and enable it if it's not
$azFeature = Get-AzProviderFeature -ProviderNamespace Microsoft.Resources -FeatureName AvailabilityZonePeering
if (!$azFeature.RegistrationState.Equals("Registered")) {
    do {
        Register-AzProviderFeature -FeatureName AvailabilityZonePeering -ProviderNamespace Microsoft.Resources
        Start-Sleep -Seconds 5
        $azFeature = Get-AzProviderFeature -ProviderNamespace Microsoft.Resources -FeatureName AvailabilityZonePeering
    } until ($azFeature.RegistrationState.Equals("Registered"))
    Write-Host "The AvailabilityZonePeering feature has been enabled."
} else {
    Write-Host "The AvailabilityZonePeering feature is already enabled."
}

# Define the request body for the REST API call
$body = @{
    subscriptionIds = $subscriptions | ForEach-Object { 'subscriptions/' + $_ }
    location        = $region
} | ConvertTo-Json

# Define the request parameters for the REST API call (the Check Zone Peers endpoint)
$params = @{
    Uri         = 'https://management.azure.com/subscriptions/' + $subscriptions[0] + '/providers/Microsoft.Resources/checkZonePeers/?api-version=2022-12-01'
    Headers     = @{ 'Authorization' = "Bearer $token" }
    Method      = 'POST'
    Body        = $body
    ContentType = 'application/json'
}

# Invoke the REST API and store the response
$availabilityZonePeers = Invoke-RestMethod @params

# Initialize an empty array for the output
$output = @()

# Loop through each availability zone and its associated peers and add them to the output array
foreach ($i in $availabilityZonePeers.availabilityZonePeers.availabilityZone) {
    foreach ($zone in $availabilityZonePeers.availabilityZonePeers[$i - 1].peers) {
        $output += New-Object PSObject -Property @{
            Zone           = $i
            MatchesZone    = $zone.availabilityZone
            SubscriptionId = $zone.subscriptionId
        }
    }
}

# Output the results
$output | Format-Table

Once we have connected to Microsoft Azure and run the script, we will get an output like the one below, which I ran across my own three subscriptions:


On the right-hand side, we see the 'Zones' - these are the Physical Zones, so Zone 1 to 3.

For each subscription, we can see the Logical Zone mapping as well.

In this example, my subscription of '3bdfd67e-6280-43af-8121-4f04dc84706c', if I were to deploy to Zone 2 in my Azure Portal, would deploy to the same physical datacenter as Zone 1 of: '8df7caa2-95cb-44d1-9ecb-e5220ec6a825'.

As you can also see, Zone 3 maps to the same physical zone across all of my subscriptions, but there are differences between Zones 1 and 2.

Again, during normal business as usual, you don't need to know this - but it's always good to know how this works. If you want confirmation of the resiliency of your architecture across Availability Zones, this is a great way to confirm whether your resources are physically located together - or not.

Azure Architecture - Solution Requirement Consideration Checklist

· 5 min read

Building a cloud solution on Azure can be an exciting yet daunting task.

The key to a successful implementation is careful planning and consideration of solution requirements, using the guidance of the Microsoft Cloud Adoption Framework and Well-Architected Framework.

But knowing what questions to ask and what data to capture to see the bigger picture - considering the solution for both the short term and the long term - can be difficult. This is where the Azure architecture solution requirements checklist comes in.

Leaning on the great guidance already in place in the Cloud Adoption and Well-Architected frameworks, the solution requirements checklist is intended to act as a way of asking and capturing the requirements of your solutions. It can be a great reminder to discover some of those requirements (whether functional or non-functional) that you may have forgotten about!

I am using the Azure Review Checklist - as a building block! I created a custom checklist intended to work alongside the review checklists - but aimed more at the discovery and requirements-gathering stage to assist with designing the proper outcomes for the business.

At the time of this article, there are 8 main categories and various sub-categories:

  • AI
  • Business
  • Data
  • Governance
  • Infrastructure
  • Microservices
  • Observability
  • Resiliency

Examples of some questions are:

Main Area | Sub Area | Checklist item | Description (optional)
Business | Goals | Why are we moving the solution to Azure? | Understand the reasoning behind the decision to move to a cloud platform like Azure. Helps to validate that the end result reaches this goal.
Business | Goals | What are the business objectives or quantifiable business goals? | What are the business objectives (i.e. increased sales revenue, cost reduction, customer satisfaction, employee productivity)?
Business | Goals | What outcomes will you achieve for the customer? | What are the objectives for the customer? What do they want to achieve using this solution?
Business | Goals | Is there a timeline for building the solution in Azure? | Asking about the timeline for building a solution in Azure is important to determine resource allocation, budgeting, prioritization, and setting stakeholder expectations.
Business | Goals | How many people will be accessing the solution? | Asking about the number of people accessing the solution helps to determine the necessary resources and scalability required to accommodate the expected traffic and usage.
Business | Goals | Is there a targeted event or date for an announcement about the solution's availability on Azure? | A timeline for architecture, deployment, and testing can help determine risks, resource requirements, cost, and the delivery of the solution.
Business | Goals | Does the solution impact a team, department or organization? | The impact on a team, department, or organization helps determine the scope and potential consequences of the solution, ensuring that all relevant stakeholders are considered and accounted for in the decision-making process.
Business | Customers | What are the customer expectations? | Customer expectations help ensure that the solution meets the needs and desires of the end-users, and that business outcomes match customer expectations.
Business | Customers | Is there a deal or customer opportunity associated with having the solution in Azure? | Any associated deals or customer opportunities help to understand the potential financial benefits, vendor offerings and growth opportunities of using Azure as a platform for the solution.
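For reference, the checklist items themselves live in a JSON file that the Excel workbook imports. The exact schema is defined by the Azure Review Checklists project; each entry looks roughly like the sketch below (field names are an approximation from memory - check the repository for the authoritative format):

```json
{
  "items": [
    {
      "category": "Business",
      "subcategory": "Goals",
      "text": "Why are we moving the solution to Azure?",
      "description": "Understand the reasoning behind the decision to move to a cloud platform like Azure.",
      "severity": "High"
    }
  ]
}
```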

Azure Architecture - Solution requirement considerations

The Azure Architecture - Solution Requirement Consideration checklist is intended to be a living resource. I am not an expert in all fields, so there may be gaps or questions you feel are relevant or missing! Feel free to open a Pull Request to contribute! This is for you!

You can find the latest version of the checklist on GitHub here: lukemurraynz/Azure_Checklists

Using this is simple!

  1. Download the latest version of the excel spreadsheet from: Azure/review-checklists.
  2. Download the latest version of the Azure Architecture checklist.
  3. Open up the review_checklist Excel document and click Import checklist from JSON
  4. Import checklist from file
  5. Select the downloaded: azure_architecture_checklist.en.json

Once imported, you can save the Excel document and start adjusting the Severity and Status and adding Comments - capturing the information you will then use to architect your solutions!

Note: I do not create, edit or modify the Excel spreadsheet created for the Azure Review Checklists - I simply use it to run my custom checklist. Make sure to check out the Azure Review Checklists!

There are some settings that you might need to change in your system to run macro-enabled Excel spreadsheets. When initially opening the file you may see the following error, which prevents Excel from loading:

Excel cannot open the file 'review_checklist.xlsm' because the file format or file extension is not valid. Verify that the file has not been corrupted and that the file extension matches the format of the file.

In other cases the file opens with the following message, which prevents you from being able to load the checklist items:

Unblock file or add an exception to Windows Security

  1. You might need to unblock the file from the file properties in the Windows File Explorer, so that you can use the macros required to import the checklist content from the JSON file.
  2. Additionally, you might want to add the folder where you cloned this repo to the list of exceptions in Windows Security (in the Virus & Threat Protection section):

Azure Deployment history cleanup with Azure DevOps

· 7 min read

Microsoft Azure has a limit of 800 deployments per resource group. This means that a single resource group can only contain 800 historical deployments at most.

A deployment in Azure refers to the process of creating or updating resources in a resource group.

When deploying resources in Azure, it is essential to keep track of the number of historic deployments in a resource group to ensure that the limit is not exceeded. This is because new deployments will fail if the limit is exceeded, and creating or updating resources in that resource group will not be possible.

If you have CI/CD (Continuous Integration and Continuous Deployment) set up to deploy or change your infrastructure or services with code, it can be easy to reach this limit. Azure will attempt to prune the deployment history automatically when you reach the limit. Still, you may want to pre-empt any problems if you make many deployments and the system hasn't had time to prune automatically, or automatic cleanup is disabled.

This came up in conversations on Microsoft Q&A, so I thought I would dig into it and put together a possible option.

To avoid exceeding the deployment limit, it may be necessary to clean up old deployments.

This can be done by using a script to remove deployments that are no longer needed.
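The core decision such a script makes is simply "keep the newest N deployments per resource group, delete the rest". A minimal Python sketch of that pruning logic (the function name and keep-latest count are illustrative assumptions, not part of the actual pipeline script):

```python
from datetime import datetime, timedelta

AZURE_DEPLOYMENT_LIMIT = 800  # hard limit on historical deployments per resource group

def deployments_to_delete(deployments, keep_latest=5):
    """Given (name, timestamp) pairs for a resource group's deployment history,
    return the names to delete so only the newest `keep_latest` remain."""
    ordered = sorted(deployments, key=lambda d: d[1], reverse=True)
    return [name for name, _ in ordered[keep_latest:]]

# Example: 8 historical deployments, one per day; keep the newest 5.
now = datetime(2023, 1, 1)
history = [(f"deploy-{i}", now - timedelta(days=i)) for i in range(8)]
print(deployments_to_delete(history))  # ['deploy-5', 'deploy-6', 'deploy-7']
```

In the real pipeline, the deletion itself is an Azure operation (Microsoft.Resources/deployments/delete) applied to each name returned.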

So let's build an Azure DevOps pipeline that runs weekly to connect to our Microsoft Azure environment and clean up historical deployments.

Microsoft Azure Deployment History Cleanup with Azure DevOps

For this article, I will assume you have an Azure DevOps repository set up and the permissions (Owner) to perform the necessary privileged actions in the Microsoft Azure environment.

Note: Scripts and pipeline are "here".

Deploy and Configure

Create Service Principal
  1. Navigate to the Microsoft Azure Portal
  2. Click on Microsoft Entra ID
  3. Click on App Registrations
  4. Click on: + New Registration
  5. Enter the following information:
    • Name (i.e. SPN.AzDeploymentCleanup)
  6. Click Register
  7. Copy the following for later when we add the SPN to Azure DevOps.
    • Application (client) ID
    • Directory (tenant) ID
  8. Click on Certificates & Secrets
  9. Press + New Client Secret
  10. Enter a relevant description and expiry date and click Add
  11. Copy the value of the new secret (this is essentially your password); you won't be able to see the value again.
Create Custom Role & Assign permissions

Now that your service principal has been created, it is time to assign permissions. Because this script targets all subscriptions under a management group, we will set the permissions at that management group so that they flow to all subscriptions underneath it - and, in the spirit of least privilege, we will create a Custom Role to apply to our Service Principal.

Create Custom Role

For the deployment history cleanup to work, we will need the following permissions:

  • Microsoft.Resources/deployments/delete
  • Microsoft.Resources/subscriptions/resourceGroups/read
  • Microsoft.Management/managementGroups/read
  • Microsoft.Resources/subscriptions/read
  • Microsoft.Management/managementGroups/descendants/read
  • Microsoft.Management/managementGroups/subscriptions/read
  • Microsoft.Resources/subscriptions/resourcegroups/deployments/operations/read
  • Microsoft.Resources/subscriptions/resourcegroups/deployments/read
  • Microsoft.Resources/subscriptions/resourcegroups/deployments/write
  • Microsoft.Resources/deployments/read
  1. Navigate to the Microsoft Azure Portal
  2. In the search bar above, type in and navigate to Management Groups
  3. Click a management group, click on Access Control (IAM)
  4. Click + Add
  5. Click Add Custom Role
  6. Type in a role name (an example is: AzDeploymentHistoryCleanup)
  7. Check Start from scratch and click Next
  8. Click + Add permissions and add the permissions above (you can search for them). Feel free to import the role from a JSON file "here".
  9. Click Next
  10. Add Assignable Scopes (this is the scope at which the role can later be assigned - it doesn't grant anything to the Service Principal yet; it only makes the role available for assignment). Make sure you set it at the management group level you are targeting.
  11. Click Review + Create
  12. Click Create
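As an alternative to clicking through the Portal, the same custom role can be described as a JSON role definition (the shape used by the Az PowerShell/CLI custom role tooling). This is a sketch - the role name matches the example above, and the management group ID placeholder is an assumption you need to replace with your own:

```json
{
  "Name": "AzDeploymentHistoryCleanup",
  "IsCustom": true,
  "Description": "Least-privilege role for cleaning up Azure deployment history.",
  "Actions": [
    "Microsoft.Resources/deployments/delete",
    "Microsoft.Resources/deployments/read",
    "Microsoft.Resources/subscriptions/read",
    "Microsoft.Resources/subscriptions/resourceGroups/read",
    "Microsoft.Resources/subscriptions/resourcegroups/deployments/read",
    "Microsoft.Resources/subscriptions/resourcegroups/deployments/write",
    "Microsoft.Resources/subscriptions/resourcegroups/deployments/operations/read",
    "Microsoft.Management/managementGroups/read",
    "Microsoft.Management/managementGroups/descendants/read",
    "Microsoft.Management/managementGroups/subscriptions/read"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/providers/Microsoft.Management/managementGroups/<your-management-group-id>"
  ]
}
```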
Assign Permissions

Now that the custom role has been created, it is time to assign it to the service principal we made earlier.

  1. Navigate to the Microsoft Azure Portal
  2. In the search bar above, type in and navigate to Management Groups
  3. Click on the management group you want to manage and click on Access Control (IAM)
  4. Click Add
  5. Click Add Role Assignment
  6. Select your custom role (you can toggle the type column, so CustomRoles are first in the list)
  7. Microsoft Azure - add role assignments
  8. Click Members
  9. Make sure 'User, group or service principal' is selected and click + Select Members
  10. Microsoft Azure - Add Roles.
  11. Select your Service Principal created earlier (i.e. SPN.AzDeploymentCleanup)
  12. Click Select
  13. Click Review + assign to assign the role.

Note: Copy the Management Group ID and name, as we will need the information, along with the Service Principal and tenant IDs from earlier, in the next step of setting up Azure DevOps.

Configure Azure DevOps Service Endpoint

Now that the Service Principal and permissions have been assigned in Azure, it's time to create the service connection endpoint that will allow Azure DevOps to connect to Azure.

  1. Navigate to your Azure DevOps organisation.
  2. Create a Project, if you haven't already
  3. Click on Project Settings
  4. Navigate to Service Connection
  5. Click on New service connection
  6. Select Azure Resource Manager
  7. Click Next
  8. Select Service principal (manual)
  9. Click Next
  10. For the scope level, choose Management Group
  11. Enter the Management Group ID, the Management Group Name
  12. Time to enter the Service Principal details copied earlier; for the Service Principal Id, paste in the Application (client) ID.
  13. For the Service Principal key, enter the client secret value, and enter the Tenant ID
  14. Click Verify - to verify the connectivity to the Azure platform from Azure DevOps
  15. Select Grant access permission to all pipelines and click Verify and save
Configure script and pipeline

Now that we have our:

  • Azure Service Principal
  • Custom role and assignment
  • Service connection

We now need to import the script and pipeline.

If you haven't already done so, create a Repo for the AzDeploymentCleanup files.

You can clone (or copy) the files in the AzDeploymentCleanup Repo to your own.

First, we need to copy the name of the Service Principal.

  1. Click Project settings
  2. Click Service Connections
  3. Click on your Service Connection and copy the name (i.e. SC.AzDeploymentCleanup)
  4. Azure DevOps Service Principal
  5. Navigate back to your Repo, and click on AzDeploymentCleanup.yml (this will become your pipeline)
  6. Click Edit
  7. Edit AzDeploymentCleanup YML
  8. Update the variable for ConnectedServiceNameARM to the name of your service connection
  9. Here you can also edit the Script Arguments - for example, in my demo, I am targeting the ManagementGroup named: mg-landing zones and keeping the latest five deployments.
  10. By default, I also have a cron job to schedule this pipeline at 6 AM UTC every Sunday, and you can remove or edit this.
  11. Once your changes are made, click Commit.
  12. Now that your pipeline has been updated, it's time to create it - click on Pipelines.
  13. Click New Pipeline
  14. Select Azure Repos Git (YAML)
  15. Select your Repo
  16. Select Azure DevOps repo
  17. Select Existing Azure Pipelines YAML file
  18. Select YAML
  19. Select your Pipeline YAML file and click Continue
  20. Click Save to create the pipeline
  21. Now it's time to run the pipeline! Click Run pipeline
  22. Azure DevOps - Pipeline run
  23. If successful, your script will trigger and clean up the oldest deployment history! This can take several minutes to run if you have a lot of deployments.
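For reference, the weekly trigger mentioned in the steps above uses the Azure Pipelines YAML schedules syntax. A minimal sketch (the branch name is an assumption) that runs the pipeline at 6 AM UTC every Sunday looks like this:

```yaml
schedules:
  - cron: '0 6 * * 0'          # 6 AM UTC every Sunday
    displayName: Weekly deployment history cleanup
    branches:
      include:
        - main                 # assumed default branch
    always: true               # run even when there are no code changes
```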

Azure Deployments - Cleanup - Comparison 1

Azure Deployments - Cleanup - Comparison 2

You Can't Touch This: How to Make Your Azure Backup Immutable and Secure

· 5 min read

With immutable vaults, Azure Backup ensures that recovery points, once created, cannot be deleted before their intended expiry time. Azure Backup does this by preventing any operations which could lead to the loss of backup data.

Hence, this helps you protect your backups against ransomware attacks and malicious actors by disallowing operations such as deleting backups or reducing retention in backup policies.

Immutable vaults are now generally available in all regions (March 13th, 2023).

An immutable vault can assist in safeguarding your backup data by prohibiting any actions that might result in the loss of recovery points.

Can't touch this

By securing the immutable vault setting, it can be made irreversible, which can prevent any unauthorized individuals from disabling the immutability feature and erasing the backups.

The Immutable vault configuration supports both Recovery Services vaults and Backup vaults.

While Azure Backup stores data in isolation from production workloads, it allows performing management operations to help you manage your backups, including those operations that allow you to delete recovery points. However, in certain scenarios, you may want to make the backup data immutable by preventing any such operations that, if used by malicious actors, could lead to the loss of backups. The Immutable vault setting on your vault enables you to block such operations to ensure that your backup data is protected, even if any malicious actors try to delete them to affect the recoverability of data.

Enabling immutability for the vault is a reversible operation. However, you can make it irreversible to prevent any malicious actors from disabling it (if they could disable it, they could then perform destructive operations).

The types of operations that enabling immutability on the Azure Backup vault can prevent and safeguard against are:

System | Operation type | Description
Recovery Services Vault & Backup Vault | Stop protection with delete data | A protected item can't have its recovery points deleted before their respective expiry date. However, you can still stop protection of the instances while retaining data forever or until their expiry.
Recovery Services Vault | Modify backup policy to reduce retention | Any actions that reduce the retention period in a backup policy are disallowed on an Immutable vault. However, you can make policy changes that result in the increase of retention. You can also make changes to the schedule of a backup policy.
Recovery Services Vault | Change backup policy to reduce retention | Any attempt to replace a backup policy associated with a backup item with another policy with retention lower than the existing one is blocked. However, you can replace a policy with one that has higher retention.

There are three current states for the immutability of the Backup and Recovery Services Vault:

  • Disabled
  • Enabled (soft immutability)
  • Enabled and locked (hard immutability)
State of Immutable vault setting | Description
Disabled | The vault doesn't have immutability enabled and no operations are blocked.
Enabled | The vault has immutability enabled and doesn't allow operations that could result in loss of backups. However, the setting can be disabled.
Enabled and locked | The vault has immutability enabled and doesn't allow operations that could result in loss of backups. As the Immutable vault setting is now locked, it can't be disabled. Note that immutability locking is irreversible, so ensure that you take a well-informed decision when opting to lock.

Immutable vaults and multi-user authorization can safeguard your backups from various human and technological accidents or disruptions.

Immutable vaults will not affect live or hot backups, such as snapshots.
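The three states behave like a one-way ratchet. A small Python sketch (illustrative only - this is not an Azure SDK API) of the allowed transitions, assuming a vault must be enabled before it can be locked:

```python
# Valid transitions between immutability states. 'Locked' is terminal:
# once hard immutability is applied, it can never be disabled again.
TRANSITIONS = {
    "Disabled": {"Disabled", "Unlocked"},
    "Unlocked": {"Disabled", "Unlocked", "Locked"},  # soft immutability is reversible
    "Locked":   {"Locked"},                          # hard immutability is irreversible
}

def can_transition(current: str, target: str) -> bool:
    """True when the vault's immutability setting may move from current to target."""
    return target in TRANSITIONS[current]

print(can_transition("Unlocked", "Disabled"))  # True  - soft immutability can be turned off
print(can_transition("Locked", "Disabled"))    # False - locked vaults stay locked
```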

Using the Azure Portal, let us configure immutability on your Azure Backup Vault.

  1. Navigate to your Recovery Services Vault
  2. Navigate to Properties (under Settings)
  3. Recovery Services Vault - Immutability
  4. Under Immutable vault, select Settings
  5. Click the box to enable vault immutability
  6. Enable vault immutability
  7. Click Apply
  8. The Recovery Services vault will be adjusted, and the status will change to Enabled but not locked; this means that your vault is now immutable and won't allow operations that would result in the loss of backups. However, you can reverse the change by unticking vault immutability.
  9. Immutable vault - soft
  10. To hard lock your vault, navigate back into the Immutable vault settings, toggle Locked, and click Apply. This cannot be undone, so make this decision thoughtfully, as it will stop the ability to reduce retention policies that would cause the deletion of recovery points, which could lead to increased costs in the longer term.

The Azure Backup vault immutability can also be adjusted using Azure Bicep; see the reference below.

param vaults_name string = 'rsv'

resource vaults_name_resource 'Microsoft.RecoveryServices/vaults@2022-09-10' = {
  name: vaults_name
  location: 'australiaeast'
  sku: {
    name: 'RS0'
    tier: 'Standard'
  }
  properties: {
    securitySettings: {
      immutabilitySettings: {
        state: 'Unlocked'
      }
    }
  }
}

The immutabilitySettings states are:

State | Description
Disabled | Immutability is disabled
Locked | Immutability is enabled and locked
Unlocked | Immutability is enabled but unlocked

Note: I was able to successfully delete a Recovery Services vault with locked immutability that didn't have any recovery points.

Export icons to SVG from the Microsoft Azure Portal using Amazing Icon Downloader

· 2 min read

Have you ever wanted to export an icon from the Microsoft Azure Portal but found yourself having to screenshot the icon at a low definition to include in your documentation or presentations?

Well - using the Amazing Icon Downloader browser plugin, you can export a single or all icons on a specific page in high definition as an SVG (Scalable Vector Graphics) file.

Easily view all the icons on a page - features include:

  • Search to filter down long lists of icons
  • Rename and download any single icon
  • Bulk download all icons as a .zip file
  • Works with either Chrome or Edge

The plugin is available in the Chrome and Edge browser stores and is intuitive.

Once you install it, it is a matter of browsing the page you want and clicking the extension to export.

Azure Icon Downloader

If you are after Visio stencils and other files, make sure you also check out David Summers' Azure Stencil collection.

You may run into an issue where a new feature or icon is released, and the stencil collection hasn't been updated yet - which is where the Amazing Icon Downloader can come in handy.