
Azure OpenAI error log summarization with completion

· 2 min read

I was assisting a user on Microsoft Q&A with an issue that involved looking over some event logs.

The issue itself was related to Nested Virtualization: the user was unable to install Hyper-V or WSL (Windows Subsystem for Linux), and it turned out to be an incompatibility between the SKU size and Secure Boot.

As part of troubleshooting this issue, I recreated the Azure compute environment this user had and started to delve into the Windows logs.

However, in this case I did something a bit different: I exported the logs as a text file, opened up Azure OpenAI, navigated to Azure OpenAI Studio, clicked on Completions, and used the summarization powers of the GPT 3.5 large language model to delve into the logs for me:

Azure OpenAI - Summarize Error Log

Copying the log into the Completions pane of Azure OpenAI

And using the following prompt:

----
Summarize all the errors and warnings from above and sort by potential cause of the issues, with the most likely cause first. Format as a table.

Azure OpenAI was able to use the reasoning ability of the GPT 3.5 LLM (large language model) to sort through 115 lines of logs and rank the most likely root causes of the issue.
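You can do the same thing outside of the Studio. Below is a minimal sketch that sends an exported log file to the Completions REST API with Invoke-RestMethod; the resource endpoint, deployment name, API key, and file path are all placeholders you would replace with your own, and the API version shown is just one valid example.

# Minimal sketch - endpoint, deployment name, key, and file path are placeholders
$endpoint   = "https://my-openai-resource.openai.azure.com"   # hypothetical resource name
$deployment = "text-davinci-003"                              # hypothetical completions deployment
$apiKey     = "<your-api-key>"

# Read the exported event log and build the summarization prompt
$logText = Get-Content -Path "C:\temp\eventlog.txt" -Raw
$prompt  = "$logText`n`nSummarize all the errors and warnings from above and sort by potential cause of the issues, with the most likely cause first. Format as a table."

$body = @{
    prompt      = $prompt
    max_tokens  = 500
    temperature = 0.2
} | ConvertTo-Json

# Call the Azure OpenAI Completions endpoint and print the summary
$response = Invoke-RestMethod -Method Post `
    -Uri "$endpoint/openai/deployments/$deployment/completions?api-version=2023-05-15" `
    -Headers @{ "api-key" = $apiKey } `
    -ContentType "application/json" `
    -Body $body
$response.choices[0].text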

As you can see, Azure OpenAI and LLMs can be used not just as an assistant for writing and studying, but also to assist in incident and root-cause resolution.

Bring Your Data to Life with Azure OpenAI

· 13 min read

Today, we will look at using Azure OpenAI and 'Bring Your Data' to allow recent documentation to be referenced and bring life (and relevance) to your data.

Bring Your Data to Life with Azure OpenAI

The example we are going to use today is the Microsoft Learn documentation for Microsoft Azure.

Our scenario is this:

  • We would like to use ChatGPT functionality to query up-to-date information on Microsoft Azure; in this example, we will look for information on Azure Elastic SAN (which was added in late 2022).

When querying ChatGPT for Azure Elastic SAN, we get the following:

ChatGPT - Azure Elastic SAN Query

Just like the prompt states, ChatGPT only has data up to September 2021 and isn't aware of Elastic SAN (or any other changes/updates or new (or retired) services after this date).

So let us use Azure OpenAI and bring in outside data (grounding data), in this case the Azure document library, to overlay on top of the GPT models, giving the illusion that the model is aware of the data.

To do this, we will leverage native 'Bring Your Own Data' functionality, now in Azure OpenAI - this is in Public Preview as of 04/07/2023.

To be clear, I don't expect you to start downloading from GitHub; this is just the example I have used to demonstrate adding your own data. The ability to bring in updated data on Azure, specifically, will be solved by plugins, such as Bing Search.

To do this, we will need to provision a few Microsoft Azure services, such as:

  1. Azure Storage Account (this will hold the Azure document library (markdown files) in a blob container)
  2. Cognitive Search (this search functionality is the glue that holds this solution together, breaking down and indexing the documents in the Azure blob store)
  3. Azure OpenAI (to do this, we will need GPT3.5 turbo or GPT4 models deployed)
  4. Optional - Azure Web App (this can be created by the Azure OpenAI service, to give users access to your custom data)

Make sure you have Azure OpenAI approved for your subscription

We will use the following tools to provision and configure our services:

  1. Azure Portal
  2. PowerShell (Az Modules)
  3. AzCopy

Download Azure Documents

First, we will need the Azure documents to add to our blob storage.

The Microsoft Learn documentation is open-sourced and constantly updated using a git repository hosted on GitHub. We will download and extract the document repository locally (roughly 6 GB). To do this, we will use a PowerShell script:

$gitRepoUrl = "https://github.com/MicrosoftDocs/azure-docs"
$localPath = "C:\temp\azuredocs"
$zipPath = "C:\temp\azuredocs.zip"
# Download an archive of the repository's main branch and extract it locally
Invoke-WebRequest -Uri "$gitRepoUrl/archive/refs/heads/main.zip" -OutFile $zipPath
Expand-Archive -Path $zipPath -DestinationPath $localPath

Create Azure Storage Account

Now that we have a copy of the Azure document repository, it's time to create an Azure Storage account to copy the data into. To create this storage account, we will use PowerShell.

# Login to Azure
Connect-AzAccount
# Set variables
$resourceGroupName = "azuredocs-ai-rg"
$location = "eastus"
$uniqueId = [guid]::NewGuid().ToString().Substring(0,4)
$storageAccountName = "myaistgacc$uniqueId"
$containerName = "azuredocs"
# Create a new resource group
New-AzResourceGroup -Name $resourceGroupName -Location $location
# Create a new storage account
New-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName -Location $location -SkuName Standard_LRS -AllowBlobPublicAccess $false
# Create a storage context using the account key, then create a new blob container
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value[0]
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
New-AzStorageContainer -Name $containerName -Context $storageContext

We have created our Resource Group and Storage account to hold our Azure documentation.

Upload Microsoft Learn documentation to an Azure blob container

Now that we have the Azure docs repo downloaded and extracted and an Azure Storage account to hold the documents, it's time to use AzCopy to copy the documentation to the Azure storage account. We will use PowerShell to create a SAS token (open for a day) and use that with AzCopy to copy the Azure repo into our newly created container.

     # Set variables
$resourceGroupName = "azuredocs-ai-rg"
$storageAccountName = "myaistgacc958b"
$containerName = "azuredocs"
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value[0]
$localPath = "C:\Temp\azuredocs\azure-docs-main"
$azCopyPath = "C:\tools\azcopy_windows_amd64_10.19.0\AzCopy.exe"
# Construct SAS URL for destination container
$sasToken = (New-AzStorageContainerSASToken -Name $containerName -Context (New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey) -Permission rwdl -ExpiryTime (Get-Date).AddDays(1)).TrimStart("?")
$destinationUrl = "https://$storageAccountName.blob.core.windows.net/$containerName/?$sasToken"
# Run AzCopy command as command line
$command = "& `"$azCopyPath`" copy `"$localPath`" `"$destinationUrl`" --recursive=true"
Invoke-Expression $command

Note: It took roughly 6 minutes to copy the Azure docs repo from my local computer (in New Zealand) into a blob storage account in East US, so roughly a gigabyte a minute.

Azure Storage Account - Microsoft Learn Docs

Now that we have our Azure Blob storage account, it's time to create our Cognitive Search service. We will need a Cognitive Search service with a SKU of Standard to support the 6 GB of Azure documents that must be indexed. Please check your costs; this is roughly NZ$377.96 a month to run. You can reduce this cost by limiting the amount of data you need to index (i.e., only certain documents, not an entire large repository of markdown files). Make sure you refer to the Pricing Calculator.

# Set variables
$resourceGroupName = "azuredocs-ai-rg"
$searchServiceName = "azuredocssearchservice" # Cognitive Search service name needs to be lowercase.
# Install the Az.Search module if not already present
Install-Module Az.Search
# Create a search service
$searchService = New-AzSearchService -ResourceGroupName $resourceGroupName -Name $searchServiceName -Location "eastus" -Sku Standard

Now that the Cognitive Search service has been created, we need to create the index and indexer that will index our Azure documents for use by Azure OpenAI, linking the index to the azuredocs blob container we created earlier.

Note: There is no PowerShell cmdlet support for Azure Cognitive Search indexes; you can create them using the REST API - but we will do this in the Azure Portal as part of the next step.
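If you do want to script it, here is a rough sketch of what calling the Cognitive Search REST API from PowerShell could look like. The service name, admin key, and index/indexer names are illustrative placeholders, and the full data source and index schemas are more involved than shown here; this assumes the data source and target index already exist.

# Rough sketch only - service name, admin key, and names are placeholders
$searchService = "azuredocssearchservice"
$adminKey = "<your-search-admin-key>"

# Minimal indexer definition pointing an existing data source at an existing index
$indexer = @{
    name            = "azuredocs-indexer"
    dataSourceName  = "azuredocs"
    targetIndexName = "azuredocs-index"
} | ConvertTo-Json

# Create the indexer via the Cognitive Search REST API
Invoke-RestMethod -Method Post `
    -Uri "https://$searchService.search.windows.net/indexers?api-version=2020-06-30" `
    -Headers @{ "api-key" = $adminKey } `
    -ContentType "application/json" `
    -Body $indexer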

Create Azure Cognitive Search Index

It's time to create the Cognitive Search index and an indexer that will index the content.

We will move away from PowerShell and into the Microsoft Azure Portal to do this.

  1. Navigate to the Microsoft Azure Portal
  2. In the top center search bar, type in Cognitive Search
  3. Click on Cognitive Search
  4. Click on your newly created Cognitive Search Azure Portal - Cognitive Search
  5. Select Import Data
  6. Select Azure Blob Storage Azure Portal - Cognitive Search - Add Azure Blob Storage
  7. Type in your data source name (i.e., azuredocs)
  8. For the Connection string, select Choose an existing connection
  9. Select your Azure Storage account and a container containing the Azure document repository uploaded earlier.
  10. Click Select Azure Portal - Cognitive Search - Add Azure Blob Storage
  11. Click Next: Add cognitive skills (Optional)
  12. Here, you can enrich your data, such as enabling OCR (extracting text from images automatically), extracting people's names, or translating text from one language to another. These enrichments are billed separately, and we won't be using any, so we will select Skip to: Customize target index.
  13. Here is the index mapping that Cognitive Search generated automatically by scanning the schema of the documents. You can bring in additional data about your documents if you want, but I am happy with the defaults, so I click: Next: Create an indexer Azure Portal - Cognitive Search - Search Index
  14. The indexer is what populates your index, which will be referenced by Azure OpenAI later. You can schedule an indexer to run hourly if new data is being added to the Azure blob container where your source files sit; for my purposes, I am going to leave the Schedule as: Once
  15. Expand Advanced Options and scroll down a bit
  16. Here, we can choose to index only certain files. For our purposes, we are going to exclude png files; the Azure document repository contains png image files that can't be indexed (we aren't using OCR), so I am going to optimize the indexing time slightly by excluding them. You can also exclude gif/jpg image files. Azure Portal - Cognitive Search - Create Search Indexer
  17. Finally, hit Submit to start the indexing process. This could take a while, depending on the amount of data.
  18. Leave this running in the background and navigate to the Cognitive Search resource, Overview pane to see the status. Azure Portal - Cognitive Search - Indexer

Note: You can also run the Import Data wizard in Azure OpenAI Studio, which will trigger an indexing run - but you need to keep your browser open and responsive. Depending on how much data you are indexing, doing it manually through this process may be preferable to avoid a browser timeout. You also get more options around the index.

Create Azure OpenAI

Now that we have our base Azure resources, it's time to create Azure OpenAI. Make sure your region and subscription have been approved for Azure OpenAI.

To create the Azure OpenAI service, we will be using the Azure Portal.

  1. Navigate to the Microsoft Azure Portal
  2. In the top center search bar, type in Azure OpenAI
  3. In the Cognitive Services, Azure OpenAI section, click + Create
  4. Select your subscription, region, name, and pricing tier of your Azure OpenAI service (remember certain models are only available in specific regions - we need GPT 3.5+), then select Next Azure OpenAI - Create Resource
  5. Update your Network Configuration; for this demo, we will select 'All Networks' - but the best practice is to restrict it. Click Next Azure OpenAI - Create Resource
  6. If you have any Tags, enter them, then click Next
  7. The Azure platform will now validate your deployment (an example is ensuring that the Azure OpenAI has a unique name). Review the configuration, then click Create to create your resource. Azure OpenAI - Create Resource

Now that the Azure OpenAI service has been created, you should now have the following:

  • An Azure OpenAI service
  • A Storage account
  • A Cognitive Search service

Azure OpenAI - RAG Deployed Resources

Deploy Model

Now that we have our Azure OpenAI instance, it's time to deploy our Chat model.

  1. Navigate to the Microsoft Azure Portal
  2. In the top center search bar, type in Azure OpenAI
  3. Open your Azure OpenAI instance to the Overview page, and click: Go to Azure OpenAI Studio
  4. Click on Models, and verify that you have GPT models (i.e., gpt-35-turbo or gpt-4). If you don't, then make sure you have been onboarded.
  5. Once verified, click on Deployments
  6. Click on + Create new deployment
  7. Select your model (I am going to go with gpt-35-turbo), type in a deployment name, and then click Create
  8. Once deployment has been completed, you may have to wait up to 5 minutes for the Chat API to be aware of your new deployment.

Azure OpenAI - Deploy Model
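Once the deployment is live, you can sanity-check it from PowerShell before moving on. This is a minimal sketch against the Chat Completions REST API; the resource name, deployment name, and key are placeholders.

# Minimal sketch - resource name, deployment name, and key are placeholders
$endpoint   = "https://my-openai-resource.openai.azure.com"
$deployment = "gpt-35-turbo"
$apiKey     = "<your-api-key>"

$body = @{
    messages = @(
        @{ role = "system"; content = "You are an AI assistant that helps people find information." }
        @{ role = "user"; content = "Say hello." }
    )
} | ConvertTo-Json -Depth 5

# Call the Chat Completions endpoint for the new deployment
$response = Invoke-RestMethod -Method Post `
    -Uri "$endpoint/openai/deployments/$deployment/chat/completions?api-version=2023-05-15" `
    -Headers @{ "api-key" = $apiKey } `
    -ContentType "application/json" `
    -Body $body
$response.choices[0].message.content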

Run Chat against your data

Finally, it's time to query and work with our Azure documents.

We can do this using the Chat Playground, a feature of Azure OpenAI that allows us to work with our Chat models and adjust prompts.

We will not change the System Prompt (although to get the most out of your own data, I recommend giving it a go); the System Prompt will remain as: You are an AI assistant that helps people find information.

  1. Navigate to the Microsoft Azure Portal
  2. In the top center search bar, type in Azure OpenAI
  3. Open your Azure OpenAI instance to the Overview page, and click: Go to Azure OpenAI Studio
  4. Click Chat
  5. Click Add your data (preview) - if this doesn't show, ensure you have deployed a GPT model as part of the previous steps.
  6. Click on + Add a data source
  7. Select the dropdown list in the Select data source pane and select Azure Cognitive Search
  8. Select your Cognitive Search service, created earlier
  9. Select your Index
  10. Check: I acknowledge that connecting to an Azure Cognitive Search account will incur usage to my account. View Pricing
  11. Click Next
  12. It should bring in the index metadata; for example - our content data is mapped to content - so I will leave this as is; click Next
  13. I am not utilizing Semantic search, so I click Next
  14. Finally, review my settings and click Save and Close
  15. Now we can verify that our own data is being used by leaving Limit responses to your own data content checked
  16. Then, in the Chat session, in the User message, I type: Tell me about Azure Elastic SAN?
  17. It will now reference the Cognitive Search and bring in the data from the index, including references to the location where it found the data!

Azure OpenAI - Chat Playground
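Everything the playground does here can also be driven programmatically. As a rough sketch, and assuming the preview 'extensions' Chat Completions endpoint that backs this feature, a request with your Cognitive Search service attached as a data source could look something like the following; all names and keys are placeholders, and this API surface is in preview, so it may change.

# Rough sketch of the preview 'on your data' extensions endpoint - names and keys are placeholders
$endpoint       = "https://my-openai-resource.openai.azure.com"
$deployment     = "gpt-35-turbo"
$apiKey         = "<your-openai-api-key>"
$searchEndpoint = "https://azuredocssearchservice.search.windows.net"
$searchKey      = "<your-search-admin-key>"

$body = @{
    dataSources = @(
        @{
            type       = "AzureCognitiveSearch"
            parameters = @{
                endpoint  = $searchEndpoint
                key       = $searchKey
                indexName = "azuredocs-index"   # hypothetical index name
            }
        }
    )
    messages = @(
        @{ role = "user"; content = "Tell me about Azure Elastic SAN?" }
    )
} | ConvertTo-Json -Depth 10

# Ask the grounded question against the deployment
$response = Invoke-RestMethod -Method Post `
    -Uri "$endpoint/openai/deployments/$deployment/extensions/chat/completions?api-version=2023-06-01-preview" `
    -Headers @{ "api-key" = $apiKey } `
    -ContentType "application/json" `
    -Body $body
$response.choices[0].message.content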

Optional - Deploy to an Azure WebApp

Interacting with our data in the Chat playground can be an enlightening experience, but we can go a step further and leverage the native tooling to deploy a chat interface - straight to an Azure web app.

To do this, navigate back to the Chat playground and ensure you have added your own Cognitive Search data source and can successfully retrieve data from your index.

  1. Click on Deploy to
  2. Select A new web app...
  3. If you have an existing WebApp, you can update it with an updated System Message or Search Index from the Chat playground settings, but we will: Create a new web app
  4. Enter a suitable name (i.e., AI-WebApp-Tst - this needs to be unique)
  5. Select the Subscription, Resource Group, and location to deploy to. I had issues accessing my custom data when deployed to Australia East (as my AI services are in East US), so I will deploy in the same region as my OpenAI and Cognitive Search services - i.e., East US.
  6. Specify a plan (i.e., Standard (S1))
  7. Click Deploy

Azure OpenAI - Deploy

Note: Deployment may take up to 10 minutes; you can navigate to the Resource Group you are deploying to, select Deployments, and monitor the deployment. Once deployed, it can take another 10 minutes for authentication to be configured.

Note: By default, the Azure WebApp is restricted to only be accessible by yourself; you can expand this out by adjusting the authentication.

Once it is deployed, you can access the Chat interface, directly from an Azure WebApp!

Azure OpenAI - Run Azure WebApp

If you navigate to the WebApp resource in the Azure Portal and look at the Configuration and Application Settings of your WebApp, you can see the variables used as part of the deployment. You can adjust these, but be wary, as it could break the WebApp; I would recommend redeploying/updating the WebApp from Azure OpenAI Studio for major changes. Azure OpenAI - App Service - App Settings
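If you want to review those settings from PowerShell rather than the Portal, a quick sketch follows; the WebApp name reuses the example from above, and the resource group name is a placeholder.

# List the app settings on the deployed WebApp - resource group name is a placeholder
$webApp = Get-AzWebApp -ResourceGroupName "azuredocs-ai-rg" -Name "AI-WebApp-Tst"
$webApp.SiteConfig.AppSettings | Format-Table Name, Value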

Azure OpenAI - Call to LLM failed

· 2 min read

I recently ran into an error when attempting to use Azure OpenAI and custom data in the Chat Playground. The error I was getting was:

Error Cannot connect to host aidocsopenaiaccount.openai.azure.com:443 ssl:default [Name of Service not known] Call to LLM failed.

Call to LLM failed.

As I was calling data that was indexed by Azure Cognitive Search, I thought the index was corrupted or invalid, but after multiple indexing attempts, I continued to get the same error, while all other prompts (not using my custom data) succeeded.

I recreated the Azure OpenAI service and was able to successfully call the custom data, so I started to compare the Azure OpenAI instance that was working with the Azure OpenAI instance that wasn't.

I did this by taking a look at the JSON of each Azure OpenAI instance, then did a comparison, and discovered a major difference.

Azure OpenAI - Diff

Note: The one on the left is the working one; the one on the right is the failed one.

The version that was working had an endpoint of https://*.openai.azure.com, and the version that didn't work had a different endpoint: eastus.api.cognitive.microsoft.com.

You can check the JSON output of a resource by navigating to it in the Azure Portal and, in the Overview pane, clicking JSON View.

Azure OpenAI - JSON View
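You can also check the endpoint without leaving PowerShell. A quick sketch, with the resource group name assumed and the account name taken from the error above:

# Quick check of the endpoint on an Azure OpenAI (Cognitive Services) account
$resourceGroupName = "openai-rg"          # assumed resource group name
$accountName = "aidocsopenaiaccount"      # account name from the error above
(Get-AzCognitiveServicesAccount -ResourceGroupName $resourceGroupName -Name $accountName).Endpoint
# A healthy Azure OpenAI instance should return https://<name>.openai.azure.com/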

So why would this have been the case?

When I originally created the Azure OpenAI instance that had this issue, I used PowerShell to create the instance:

# Example variable values - the original snippet omitted these definitions
$resourceGroupName = "openai-rg"
$accountName = "aidocsopenaiaccount"
$location = "eastus"
$skuName = "S0"
$kind = "OpenAI"
# Create Cognitive Services resource
New-AzCognitiveServicesAccount -ResourceGroupName $resourceGroupName -Name $accountName -Location $location -SkuName $skuName -Kind $kind

This seemed to call an older API or different schema.

The version that worked, I created using the Azure Portal - which was correct.

If you run into the same problem, recreate your Azure OpenAI instance using the Azure Portal or one of the currently supported methods (see: Create a resource and deploy a model using Azure OpenAI). Currently (as of the 4th of July 2023), PowerShell is not a supported method of creating an Azure OpenAI instance, until the cmdlets have been updated.

Azure Management Groups not displaying in the Azure Portal

· 2 min read

When logging into the Microsoft Azure Portal to view your Management Groups, you might have found the page blank or constantly attempting to load.

Azure Management Group - Not loading

It looks like a potential bug in the Portal interface, especially if you have the correct permissions to see those Management Groups. Here are a few things to look for:

Elevated rights - Global Administrator

By default, being a Global Administrator does not allow you to see Azure resources or manage Azure Management Groups.

If you have the Global Administrator role, and this is the first time you are setting up Management Groups, then you can elevate your access to manage the Azure Management Groups.

Note: Because of the elevated nature of this role, it is recommended to enable it only for the period you need to do your work, and make sure you have specific roles assigned to manage your Azure infrastructure, as necessary.

If this is not the first time you have set up Management Groups, ensure you have the rights to see them.
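As a side note, the elevation itself can also be triggered from PowerShell. This is a hedged sketch using the documented elevateAccess ARM endpoint via Invoke-AzRestMethod; the same toggle is available in the Azure AD Properties blade ('Access management for Azure resources').

# Elevate a Global Administrator to User Access Administrator at root scope
# (equivalent to the 'Access management for Azure resources' toggle)
Connect-AzAccount
Invoke-AzRestMethod -Method POST -Path "/providers/Microsoft.Authorization/elevateAccess?api-version=2016-07-01"
# Remember to remove the elevated access once you are done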

Create an Azure Management Group using PowerShell

One workaround discovered is that creating a Management Group triggers the Azure Portal to update, allowing management.

You can do this using the Azure PowerShell cmdlets by running:

     New-AzManagementGroup -GroupName 'Contoso'

Once the Management Group is created, you should be able to refresh the Azure Management Group page in the Portal and view your Management Groups; if that doesn't work, log out of the Portal and back in.

You can then use the Remove cmdlet to delete the new Management Group you created.

     Remove-AzManagementGroup -GroupName 'Contoso'

Note: This article was based on findings from the Microsoft Q&A article: Management Groups Unavailable in Tenant: Limited Account Control and Organization

Cleanup your unwanted Azure resources on a schedule

· 12 min read

Cleanup your unwanted Azure resources on a schedule

Every few months, I get that dreaded email: "Your Microsoft Azure subscription has been suspended". This is due to creating resources and leaving them provisioned, so I needed a method of deleting the resources I didn't need or only wanted to spin up for a few days. I also needed a way of keeping resources that should stay, either for learning or a demo, independent of how the resources were deployed into the environment (via the Azure Portal, Terraform, Bicep).

Naturally, I went straight to Azure Automation and PowerShell.

What I ended up with was a Runbook capable of EXTREME AZURE DESTRUCTION, which was exactly what I wanted.

This script is provided as-is with no warranties or guarantees. Use at your own risk. This is not intended to be a script to use in Production, mainly test environments, as this WILL CAUSE massive destruction and irretrievable data loss... You have been warned.

I am not going to go into setting up Azure Automation; if interested, you can refer to a few of my previous blog posts that go through the process:

The script is named Initiate-DakaraSuperWeapon, aptly named as a reference to the Dakara superweapon from the TV series Stargate SG-1 - a weapon of great power.

The Dakara superweapon was an Ancient device capable of reducing all matter to its basic elemental components, and/or restructuring it. Possessing the ability to pass through the shields of known ships, it also functions (and has been used) as a devastating weapon to kill the entire crew of orbiting ships or wipe out all life on the surface of hundreds of planets at a time. "It is not only capable of destroying the Replicators but all life in the galaxy."

Azure Dakara superweapon

Using the latest PowerShell release - 7.2 (Preview) - this script is built around the following capabilities:

  • Delete ALL resource groups (without a specific Tag) under all subscriptions, under a specific Management Group
  • Delete all resources within those resource groups
  • Delete Azure Recovery Vaults and their backed up items
  • Delete any Azure policy assignments, assigned directly to any subscription under the Management Group
  • Delete any Azure RBAC role assignments, assigned directly to any subscription under the Management Group.

In my demo environment, I have a range of Management Groups, and 2 Azure subscriptions.

Luke's Azure Management Group structure

For my purposes, I created a system-assigned Managed Identity for the Azure Automation account and assigned it 'Owner' on the 'mg' Management Group (Contributor will work, as long as you don't plan on removing role assignments from the Azure subscriptions - theoretically, Contributor + User Access Administrator roles would also work).

Again - this was created for my own environment - if you decide to run this, TEST IT! Make sure it has as limited permissions as possible; potentially, the Managed Identity should only have access to a specific test subscription that you don't care about. I take no responsibility.

The System Identity will be used to execute the runbook.

I also needed a Tag (i.e., a safe word) to save the Resource Groups that need to remain - for example, a project I am working on, a demo, etc. This Tag is in name only - as Tags are Key/Value pairs in Azure, in this case I only cared about the Key (i.e., NotDelete); what was in the value didn't matter.

NotDelete - Azure Tag
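Tagging a resource group with the safe word is a one-liner. A small sketch, assuming a hypothetical resource group named 'my-keeper-rg' and the NotDelete key used above:

# Protect a resource group from the runbook by adding the safe-word tag
# 'my-keeper-rg' is a hypothetical resource group name; the value can be anything
Set-AzResourceGroup -Name "my-keeper-rg" -Tag @{ NotDelete = "true" }

Note that Set-AzResourceGroup replaces the existing tag set, so merge in any existing tags first if the resource group already has them.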

Important: When importing the Runbook, it is imperative that you Tag the Resource Group it is in with your safe word! Or else it too will be deleted!

The script has a couple of parameters:

Parameter | Type | Notes
ManagementGroupId | String | The ID of the management group to delete resource groups under. WARNING: This script will delete all resource groups under the specified management group except for the ones with the specified tag. Make sure you have specified the correct management group ID, or you may accidentally delete resources that you did not intend to delete.
TagName | String | The name of the tag to check for. WARNING: This script will delete all resource groups that do not have this tag. Make sure you have specified the correct tag name, or you may accidentally delete resources that you did not intend to delete.
RemoveResourceGroups | Boolean | True or False: do you want to remove the Resource Groups? True means it will, and False means it will skip the Resource Group deletion.
DeletePolicyAssignments | Boolean | True or False: do you want to remove the Azure Policy assignments on the subscriptions? True means it will, and False means it will skip the Azure Policy assignment deletion.
DeleteSubRoleAssignments | Boolean | This needs Owner rights (or the User Access Administrator role) in order to remove roles from a Subscription. Make sure your rights are set to be inherited from a Management Group before running this. True or False: True means it will delete the Subscription's direct assignments, False means it will skip them.

As you can tell, you can enable or disable specific parts of the script by entering True or False - for example, if you just want to use it to clean up direct role assignments on your subscriptions without deleting Azure resources.

Initiate-DakaraSuperWeapon - Azure Runbook Parameters
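If you'd rather kick it off from PowerShell than the Portal, the runbook can be started with its parameters passed as a hashtable. A sketch, with hypothetical Automation account and resource group names:

# Start the runbook with parameters - account/resource group names are hypothetical
Start-AzAutomationRunbook -AutomationAccountName "my-automation-acct" `
    -ResourceGroupName "automation-rg" `
    -Name "Initiate-DakaraSuperWeapon" `
    -Parameters @{
        ManagementGroupId        = "mg"
        TagName                  = "NotDelete"
        RemoveResourceGroups     = $true
        DeletePolicyAssignments  = $true
        DeleteSubRoleAssignments = $false
    }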

When run, it will stream the logs to the Azure Automation Log Stream; there is no waiting time or approval - it will just run.

Initiate-DakaraSuperWeapon - Azure Automation Log Stream

As below, you can see the Resource Groups get removed (at the time of this recording, I had a limit on the number of parallel delete tasks):

Remove Azure Resource Groups

Initiate-DakaraSuperWeapon.ps1
# This runbook deletes all resource groups under a management group except for the ones with a specific tag.
<#
.SYNOPSIS
Deletes all resource groups under a management group except for the ones with a specific tag.

.DESCRIPTION
This script deletes all resource groups under a specified management group except for the ones with a specific tag. It can also delete policy assignments and subscription role assignments if specified.

.PARAMETER ManagementGroupId
The ID of the management group to delete resource groups under. WARNING: This script will delete all resource groups under the specified management group except for the ones with the specified tag. Make sure you have specified the correct management group ID, or you may accidentally delete resources that you did not intend to delete.

.PARAMETER TagName
The name of the tag to check for. WARNING: This script will delete all resource groups that do not have this tag. Make sure you have specified the correct tag name, or you may accidentally delete resources that you did not intend to delete.

.PARAMETER RemoveResourceGroups
If specified, deletes the resource groups that do not have the specified tag.

.PARAMETER DeletePolicyAssignments
If specified, deletes the policy assignments for the management group and all child subscriptions.

.PARAMETER DeleteSubRoleAssignments
If specified, deletes the subscription role assignments for all child subscriptions.

.EXAMPLE
.\Initiate-DakaraSuperWeapon.ps1 -ManagementGroupId "my-management-group" -TagName "my-tag" -RemoveResourceGroups -DeletePolicyAssignments -DeleteSubRoleAssignments
Deletes all resource groups under the "my-management-group" management group that do not have the "my-tag" tag, and deletes the policy assignments and subscription role assignments for all child subscriptions.

.NOTES
This script requires the Azure PowerShell module to be installed. It also requires Owner rights (or the User Access Administrator role) in order to remove roles from a subscription. Make sure your rights are set to be inherited from a management group before running this script.
#>

param (
    [Parameter(Mandatory = $true, HelpMessage = "The ID of the management group to delete resource groups under. WARNING: This script will delete all resource groups under the specified management group except for the ones with the specified tag. Make sure you have specified the correct management group ID, or you may accidentally delete resources that you did not intend to delete.")]
    [string]$ManagementGroupId,

    [Parameter(Mandatory = $true, HelpMessage = "The name of the tag to check for. WARNING: This script will delete all resource groups that do not have this tag. Make sure you have specified the correct tag name, or you may accidentally delete resources that you did not intend to delete.")]
    [string]$TagName,

    [Parameter(Mandatory = $false)]
    [switch][bool]$RemoveResourceGroups = $false,

    [Parameter(Mandatory = $false)]
    [switch][bool]$DeletePolicyAssignments = $false,

    [Parameter(Mandatory = $false, HelpMessage = "This will need Owner rights (or the User Access Administrator role) in order to remove roles from a Subscription. Make sure your rights are set to be inherited from a Management Group, before running this.")]
    [switch][bool]$DeleteSubRoleAssignments = $false
)

# Convert string values to boolean values
$RemoveResourceGroups = [System.Boolean]::Parse($RemoveResourceGroups)
$DeletePolicyAssignments = [System.Boolean]::Parse($DeletePolicyAssignments)
$DeleteSubRoleAssignments = [System.Boolean]::Parse($DeleteSubRoleAssignments)

# Ensures you do not inherit an AzContext in your runbook
Disable-AzContextAutosave -Scope Process

#Toggle to stop warnings with regards to Breaking Changes in Azure PowerShell
Set-Item -Path Env:\SuppressAzurePowerShellBreakingChangeWarnings -Value $true

# Connect to Azure with system-assigned managed identity
(Connect-AzAccount -Identity).context

# Write an initial log message
Write-Output "Initilizing superweapon...."

# Get the subscription IDs under the specified management group AND child management groups
function Get-AzSubscriptionsFromManagementGroup {
    param($ManagementGroupName)
    $mg = Get-AzManagementGroup -GroupId $ManagementGroupName -Expand
    foreach ($child in $mg.Children) {
        if ($child.Type -match '/managementGroups$') {
            # Recurse into child management groups
            Get-AzSubscriptionsFromManagementGroup -ManagementGroupName $child.Name
        }
        else {
            # Child is a subscription - project out its name and ID
            $child | Select-Object @{N = 'Name'; E = { $_.DisplayName } }, @{N = 'Id'; E = { $_.Name } }
        }
    }
}
$mgid = Get-AzManagementGroup -GroupId $ManagementGroupId -Expand

$subIds = (Get-AzSubscriptionsFromManagementGroup -ManagementGroupName $mgid.DisplayName).Id


# Delete the policy assignments
if ($DeletePolicyAssignments -eq $true) {
    Write-Output "Deleting management group policy assignments..."
    Get-AzPolicyAssignment -Scope $mgid.Id | Remove-AzPolicyAssignment -Verbose

    foreach ($subId in $subIds) {
        Write-Output "Setting subscription context..."
        Set-AzContext -Subscription $subId
        Write-Output "Deleting subscription policy assignments..."
        Get-AzPolicyAssignment -Scope "/subscriptions/$($subId)" | Remove-AzPolicyAssignment -Verbose
    }
}
else {
    Write-Output "Skipping policy assignment deletion..."
}

# Delete the resource groups
if ($RemoveResourceGroups -eq $true) {
    Write-Output "Deleting resource groups..."

    if ($null -ne $subIds -and $subIds.Count -gt 0) {

        foreach ($subId in $subIds) {
            Write-Output "Setting subscription context..."
            Set-AzContext -Subscription $subId

            # Select every resource group that lacks the safe-word tag
            $ResourceGroupsfordeletion = Get-AzResourceGroup | Where-Object { $null -eq $_.Tags -or $_.Tags.ContainsKey($TagName) -eq $false }
            Write-Output "The following Resource Groups will be deleted..."
            Write-Output -InputObject $ResourceGroupsfordeletion

            ## Checks to see if a Recovery Services Vault exists; the Recovery Services Vault and its backed up items need to be deleted first.
            $RSV = Get-AzRecoveryServicesVault | Where-Object { $_.ResourceGroupName -in $ResourceGroupsfordeletion.ResourceGroupName }
            if ($null -ne $RSV) {
                ForEach ($RV in $RSV) {
                    Write-Output "Backup Vault deletion supports deletion of Azure VM backup vaults ONLY currently."
                    # Credit to Wim Matthyssen for reference in the backup section of the script - https://wmatthyssen.com/2020/11/17/azure-backup-remove-a-recovery-services-vault-and-all-cloud-backup-items-with-azure-powershell/
                    Set-AzRecoveryServicesVaultProperty -VaultId $RV.ID -SoftDeleteFeatureState Disable
                    Set-AzRecoveryServicesVaultContext -Vault $RV

                    # Recover any soft-deleted items so they can be fully removed
                    $containerSoftDelete = Get-AzRecoveryServicesBackupItem -BackupManagementType AzureVM -WorkloadType AzureVM | Where-Object { $_.DeleteState -eq "ToBeDeleted" }
                    foreach ($item in $containerSoftDelete) {
                        Undo-AzRecoveryServicesBackupItemDeletion -Item $item -Force -Verbose
                    }

                    # Disable protection and remove recovery points for remaining items
                    $containerBackup = Get-AzRecoveryServicesBackupItem -BackupManagementType AzureVM -WorkloadType AzureVM | Where-Object { $_.DeleteState -eq "NotDeleted" }
                    foreach ($item in $containerBackup) {
                        Disable-AzRecoveryServicesBackupProtection -Item $item -RemoveRecoveryPoints -Force -Verbose
                    }
                    Remove-AzRecoveryServicesVault -Vault $RV -Verbose
                }
            }

            ## Checks to see if an Azure Resource Mover collection exists, as this needs to be deleted first.
            $ARM = Get-AzResource | Where-Object { $_.ResourceGroupName -in $ResourceGroupsfordeletion.ResourceGroupName -and $_.ResourceType -eq 'Microsoft.Migrate/moveCollections' }
            Write-Output -InputObject $ARM
            if ($null -ne $ARM) {
                ForEach ($RM in $ARM) {
                    Write-Output "Azure Resource Mover collection exists."
                    Write-Output -InputObject $RM
                    $moveResources = Get-AzResourceMoverMoveResource -ResourceGroupName $RM.ResourceGroupName -MoveCollectionName $RM.Name
                    Foreach ($moveResource in $moveResources) {
                        Write-Output -InputObject $moveResource
                        # Discard, then remove each resource from the move collection
                        Invoke-AzResourceMoverDiscard -ResourceGroupName $RM.ResourceGroupName -MoveCollectionName $RM.Name -MoveResource $moveResource.Name
                        Remove-AzResourceMoverMoveResource -ResourceGroupName $RM.ResourceGroupName -MoveCollectionName $RM.Name -Name $moveResource.Name -Verbose
                    }
                    Remove-AzResourceMoverMoveCollection -ResourceGroupName $RM.ResourceGroupName -MoveCollectionName $RM.Name
                }
            }

            Write-Output "Deleting resource groups..."
            $ResourceGroupsfordeletion | ForEach-Object -Parallel {
                Remove-AzResourceGroup -Name $_.ResourceGroupName -Force
            } -ThrottleLimit 20 -Verbose

            # Remove the NetworkWatcherRG resource group if it remains - in some scenarios the script left this RG behind.
            $networkWatcherRG = Get-AzResourceGroup | Where-Object { $_.ResourceGroupName -eq 'NetworkWatcherRG' }
            if ($null -ne $networkWatcherRG -and $null -ne $networkWatcherRG.Tags -and $networkWatcherRG.Tags.ContainsKey($TagName) -eq $false) {
                Remove-AzResourceGroup -Name $networkWatcherRG.ResourceGroupName -Force -ErrorAction Continue -Verbose
            }
        }

        # Write a final log message
        Write-Output "Resource group deletion process completed."
    }
    else {
        Write-Output "No child subscriptions found under the specified management group."
    }
}
else {
    Write-Output "Skipping resource group deletion..."
}

# Delete the direct subscription role assignments
if ($DeleteSubRoleAssignments -eq $true) {
    if ($null -ne $subIds -and $subIds.Count -gt 0) {

        foreach ($subId in $subIds) {
            Write-Output "Setting subscription context..."
            Set-AzContext -Subscription $subId
            Write-Output "Deleting subscription role assignments..."
            $roleAssignments = Get-AzRoleAssignment -Scope "/subscriptions/$($subId)" -IncludeClassicAdministrators
            Write-Output -InputObject $roleAssignments

            # Loop through each role assignment and delete it if it is not inherited from a management group
            foreach ($roleAssignment in $roleAssignments) {
                if ($roleAssignment.Scope -like "/subscriptions/*" -and $null -ne $roleAssignment.ObjectId -and $roleAssignment.ObjectId -ne "") {
                    Write-Output "Deleting role assignment..."
                    Remove-AzRoleAssignment -Scope $roleAssignment.Scope -ObjectId $roleAssignment.ObjectId -RoleDefinitionName $roleAssignment.RoleDefinitionName -Verbose -ErrorAction Continue
                }
            }
        }

    }
}
else {
    Write-Output "Skipping subscription role assignment deletion..."
}

Using an Azure Automation schedule, I can then set this Runbook to run every day, week, etc. - knowing my environment will be fresh for my next project or learning exercise.
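For reference, a schedule like that can also be created and linked from PowerShell. A sketch, reusing the same hypothetical Automation account and resource group names as above:

# Create a daily schedule and link it to the runbook - account/resource group names are hypothetical
$params = @{
    AutomationAccountName = "my-automation-acct"
    ResourceGroupName     = "automation-rg"
}
New-AzAutomationSchedule @params -Name "DailyCleanup" -StartTime (Get-Date).AddDays(1) -DayInterval 1
Register-AzAutomationScheduledRunbook @params -RunbookName "Initiate-DakaraSuperWeapon" `
    -ScheduleName "DailyCleanup" `
    -Parameters @{ ManagementGroupId = "mg"; TagName = "NotDelete"; RemoveResourceGroups = $true; DeletePolicyAssignments = $false; DeleteSubRoleAssignments = $false }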