
· 6 min read

Imagine if you could validate that your Azure resources are deployed per the Well-Architected Framework (WAF), and that your services are secure and deployed following Azure architecture guidance, both before and after the resources have been created!

Imagine no longer! There is a PowerShell module designed specifically for that purpose: PSRule for Azure.

PSRule - Azure

PSRule for Azure is a suite of rules for validating Azure resources and infrastructure as code (IaC); it is built on top of the base PSRule module.

Features of PSRule for Azure include:

  • Leverage over 200 pre-built rules across five WAF pillars:
    • Cost Optimization
    • Operational Excellence
    • Performance Efficiency
    • Reliability
    • Security
  • Validate resources and infrastructure code pre- or post-deployment using Azure DevOps or GitHub!
  • It runs on macOS, Linux, and Windows.

With over 200 inbuilt rules (and you can add your own), there are a lot of resource types covered, such as (but not limited to):

  • Azure App Service
  • Azure Key vault
  • Azure Virtual Machine
  • Azure Storage
  • Azure Network
  • Azure Public IP
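
If you're curious which rules ship for a given service, the module's rules can be enumerated from PowerShell. A quick sketch, assuming PSRule.Rules.Azure is already installed (the grouping key relies on the `Azure.<Service>.<Rule>` naming convention):

```powershell
# Count the bundled rules and group them by the service segment of the rule name
# (e.g. Azure.AppService.*, Azure.KeyVault.*)
Get-PSRule -Module 'PSRule.Rules.Azure' |
    Group-Object { ($_.Name -split '\.')[1] } |
    Sort-Object Count -Descending |
    Select-Object Count, Name
```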

PSRule for Azure has been in development since 2019 and receives constant updates and fixes.

PSRule for Azure provides two methods for analyzing Azure resources:

  • Pre-flight - Before resources are deployed from Azure Resource Manager templates.
  • In-flight - After resources are deployed to an Azure subscription.

Pre-flight validation is used to scan ARM (Azure Resource Manager) templates before services are deployed, allowing for quality gates and better information in pull requests to improve your infrastructure as code components.

The in-flight method can also be used in Azure DevOps for validation of Terraform resource deployments and the like. Still, in this demo, I will run you through installing the module and doing an export and scan from your PowerShell console!

We are going to install the PSRule.Rules.Azure module (based on the Well-Architected Framework and Cloud Adoption Framework).

I recommend keeping the modules (and as such the built-in rules) up to date and running scans at least every quarter, or after a major deployment or project, to help verify your resources are set up according to best-practice rules. This does not replace Security Center and Azure Advisor; it is intended as a supplement.

Install PSRule.Azure

  1. Open a PowerShell console and run the following commands:

    #The base PSRule engine and the Azure rules module
    Install-Module PSRule -Scope CurrentUser
    Install-Module PSRule.Rules.Azure -Scope CurrentUser

  2. Press 'Y' to accept PSGallery as a trusted repository. Just a note: you can prevent the confirmation prompt when installing modules from the PSGallery by classifying it as a 'Trusted Repository' with the command below; just be wary that you won't get challenged again:

    Set-PSRepository -Name 'PSGallery' -InstallationPolicy Trusted

3. You should now have the following modules installed:

  • PSRule
  • PSRule.Rules.Azure
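
To confirm both modules landed (and to see which versions you have), something like this works:

```powershell
# Lists the installed PSRule modules and their versions
Get-InstalledModule -Name PSRule, PSRule.Rules.Azure |
    Select-Object Name, Version
```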

Extract Azure Subscription PSRule JSON files

Now that PSRule has been installed, it's time to log in to Azure and extract information about your Azure resources for analysis. The extracted files are JSON files containing information such as your resource names, subscription ID and resource groups in plain text.

As you can see from the screenshot below, we can target specific Subscriptions, Tenancies (as long as the account you are using has access to the subscription, you can export those as well), Resource Groups and Tags.


Because I want to get the most data available across all resources, I will target everything with the '-All' parameter.

  1. First, we need to connect to Azure and then select the subscription we are targeting by running the following:

    Connect-AzAccount
    Get-AzSubscription | Out-GridView -PassThru | Set-AzContext
  2. Now that you have connected, it's time to export the Azure resource information; run the following PowerShell cmdlet and point it towards an empty folder:

    Export-AzRuleData -OutputPath c:\temp\AzRuleData -All
  3. If the folder doesn't exist, don't worry - the Export command will create it for you. Depending on how many resources and subscriptions you are extracting, this may take a few minutes.

You should see the JSON files appearing; if you open one of them, you should be able to see information about the resources it has extracted.
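If you'd rather inspect the exported data from the console than read the raw JSON, here is a small sketch (the property names shown are examples and may vary by resource type):

```powershell
# Load the first exported file and list a few of the resources inside it
$file = Get-ChildItem 'C:\temp\AzRuleData\*.json' | Select-Object -First 1
$resources = Get-Content $file.FullName -Raw | ConvertFrom-Json
$resources | Select-Object -First 5 -Property Name, Type, ResourceGroupName
```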

Run PSRule across your JSON files

Now that you have extracted the JSON files of your Azure resources, it's time to analyse them against the Microsoft Cloud Adoption and Well-Architected Frameworks, using the rules built into PSRule.Rules.Azure!

You don't need to be connected to Azure for this analysis; you only need the PSRule modules installed and access to the JSON files.

PSRule.Rules.Azure ships with a few baselines; these baselines contain the rules used to analyse your resources and range from preview to newly released rules. Again, we will target ALL rules, as we are after all recommendations.
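You can list which baselines the module ships before picking one; a quick sketch (baseline names such as Azure.Default, Azure.All and Azure.Preview are typical, but check your installed version):

```powershell
# Lists the baselines bundled with PSRule.Rules.Azure
Get-PSRuleBaseline -Module 'PSRule.Rules.Azure' |
    Select-Object Name
```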

  1. In PowerShell, run the following:

    Assert-PSRule -Module 'PSRule.Rules.Azure' -InputPath 'C:\temp\AzRuleData\*.json' -Baseline 'Azure.All'
  2. This will trigger PSRule to scan your extracted JSON files against all rules, and you will get output like below:

  3. Invoke-PSRule

  4. Although it is good to see a high-level view, I prefer to look at it all at once in Excel, so run the following to export the results to a CSV:

    Invoke-PSRule -Module 'PSRule.Rules.Azure' -InputPath 'C:\temp\AzRuleData\*.json' -Baseline 'Azure.All' | Export-Csv C:\temp\AzRuleData\Exported_Data.csv -NoTypeInformation
  5. You should now have a CSV file to review; look for common issues and concerns, and work on improving your Azure infrastructure setup!

PS Rules Azure - Export CSV

Note: The export contains the Subscription/Resource names, so you can definitely see which resources you can improve upon; however, I removed them from my screenshot.
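
Once you have the results, it often helps to focus on the failures first. A sketch that summarises the most common failing rules (the `Outcome` and `RuleName` property names follow the PSRule result object):

```powershell
# Run the analysis and group failed results by rule to find the biggest offenders
$results = Invoke-PSRule -Module 'PSRule.Rules.Azure' -InputPath 'C:\temp\AzRuleData\*.json' -Baseline 'Azure.All'
$results | Where-Object Outcome -eq 'Fail' |
    Group-Object RuleName |
    Sort-Object Count -Descending |
    Select-Object -First 10 Count, Name
```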

Congratulations! You now have more visibility and, hopefully, some useful recommendations for improving your Azure services!

If you want to get a good understanding of the type of data rules, check out my extracted CSV 'here'.

Additional Resources

  • If you found PSRule for Azure interesting, how about getting any failed rules pushed to Azure Monitor?

PSRule to Azure Monitor

  • If you are interested in the CI (Continuous Integration) options, check out the links below:

Azure DevOps Pipeline & Github Actions

  • Extend the PSRules to include Cloud Adoption Framework as well?

PSRule for Cloud Adoption Framework

  • And finally, creating Custom Rules for your organisation, including Tagging, Naming conventions etc.?

PSRule.Azure Custom Rules

· 5 min read

Microsoft has now added a built-in monitoring workbook for Azure Virtual Desktop performance monitoring; this monitoring includes dashboards related to (but not limited to):

  • Session host diagnostics
  • Connection performance
  • Host performance
  • User login events
  • Remote Desktop client versions

To configure, we first need to create a Log Analytics workspace that both the Host Pool and Workspace will connect to.

Create a Log Analytics workspace

You can use an existing Log Analytics workspace if you already have one; if not, we will have to create one.

  1. Log in to the Azure Portal
  2. Click on + Create a Resource
  3. Search for: Log Analytics workspace
  4. Click Create
  5. Azure Portal - Log Analytics Marketplace
  6. Here we can select the Resource Group, Name and Location of the Log Analytics workspace we will create.
  7. I am going to create a new Resource Group called: aad_mgmt
  8. I click Create New and enter in the name of the Resource Group
  9. Under Instance Details, make sure you select a name that adheres to your naming governance.
  10. Note: the name of your Log Analytics workspace is scoped to the Resource Group, so you can have Log Analytics workspaces with the same name, as long as they are in different resource groups.
  11. Azure Portal - Create Log Analytics
  12. Click on: Next: Pricing Tier
  13. Select the applicable pricing tier, I only have Pay-as-you-go (Per GB 2018), so I will select that.
  14. Note: You can view the pricing for Log Analytics on the Pricing Calculator: look at the Pay-As-You-Go rates.
  15. Azure Portal - Create Log Analytics
  16. Click Next: Tags
  17. Enter any applicable tags that are relevant, such as Creator, who it may get billed to, Project ID etc., and select Review + Create
  18. Review the configuration and click Create to create your Log Analytics workspace! (It should take less than a minute.)

Configure Azure Virtual Desktop Insights

  1. Log in to the Azure Portal

  2. Search for: Azure Virtual Desktop

  3. Click on Insights

  4. A Workbook blade should now greet you

  5. This is where we will configure Azure Virtual Desktop Insights. You can see on the lower right-hand side that we will be deploying Azure Monitor for 'Windows Virtual Desktop v1.0.4' (this is managed by Microsoft, but it is handy to know the version in case of support needs later on).

  6. Azure Virtual Desktop - Insights

  7. Click on Open Configuration Workbook

  8. Here, select the Log Analytics workspace you created earlier (or want to use)

  9. Select Configure host pool

  10. Azure Virtual Desktop - Insights

  11. Click on Deploy (make sure all your Session Hosts are started so that Azure can deploy and configure the Log Analytics agent on the Virtual Machines)

  12. You can select View Template and Parameters if you want to confirm the host pool and workspace being configured.

  13. Azure Virtual Desktop - Insights

  14. While the Diagnostic host pool settings are being configured, click on Configure workspace.

  15. Click on: Deploy

  16. Once the Workspace and Host Pool deployments are done, click on Refresh.

  17. Azure Virtual Desktop - Insights

  18. Confirm that Enabled is: True

  19. Azure Virtual Desktop

  20. The journey is not over yet; now that the Host Pool and Workspace have been configured, we need to add the Session Hosts and configure the performance counters to the same workspace!

  21. Click on: Session host data settings.

  22. Select your Log Analytics workspace

  23. Select Add hosts to workspace

  24. Azure Virtual Desktop

  25. Confirm the Deployment and click Deploy

  26. Wait until the deployment has succeeded (otherwise you may get API errors).

  27. Navigate down and click Configure performance counters

  28. Azure Virtual Desktop

  29. Click on Apply config.

  30. Wait until the deployment has succeeded (otherwise you may get API errors).

  31. Navigate down and click on Configure events

  32. Azure Virtual Desktop

  33. Click on Deploy

    Now click on: Refresh, and you should see 'No missing performance counters', 'No missing events found'.

  34. Azure Virtual Desktop

  35. You have now configured Azure Virtual Desktop Insights!

    It may take a few minutes to an hour to populate and collect the data for some of the events and counters.

  36. Azure Virtual Desktop

    On the plus side, all the data is also in Log Analytics, so it can be queried, and you can set up alert rules against it to gain more visibility into your Azure Virtual Desktop environment and usage.

    Azure Virtual Desktop
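
Since the data lands in Log Analytics, the same tables the workbook uses can be queried from PowerShell. A minimal sketch, assuming the Az.OperationalInsights module and the workspace created earlier (the workspace name below is a placeholder for your own):

```powershell
# Query the WVDConnections table for connections in the last 24 hours, per user
$workspace = Get-AzOperationalInsightsWorkspace -ResourceGroupName 'aad_mgmt' -Name '<workspace-name>'
$query = 'WVDConnections | where TimeGenerated > ago(24h) | summarize Connections = count() by UserName'
(Invoke-AzOperationalInsightsQuery -WorkspaceId $workspace.CustomerId -Query $query).Results
```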

· 2 min read

When you create a Log Analytics workspace using the Azure Portal, you only get the 'Pay-as-you-go' pricing tier to select.

You used to be able to create a 'Free' tier workspace using the Azure Portal; however, in 2018 Microsoft removed it with a change in plans, and it became a legacy offering.

However, using PowerShell, you can still create a Log Analytics Free SKU!

The Free pricing tier is a legacy pricing tier that is available for trying Azure Log Analytics. It has a data cap of 500 MB/day and only 7 days of data retention, so it is intended only for testing and is not to be used for production deployments.

You can change a Free Tier Log Analytics workspace to a Pay-as-you-go or commitment tier later.

You cannot change a Log Analytics workspace created on a higher tier back to Free, even using PowerShell, due to adjustments in 2018 around the Log Analytics billing and plans.

Azure Log Analytics - Free

Create a 'Free Tier' Log Analytics using PowerShell

Change the script's variables below to suit your environment, connect to Azure and run the script to create your Log Analytics workspace.

Note: I tested this script on an MSDN subscription, which I've had for a few years and a recent one created a few months back (2021), but there may be limitations on other subscription types that I haven't tested - see blurb below the script.

#Connect to Azure
Connect-AzAccount

#Set Variables
$ResourceGroup = 'aad_mgmt'
$Location = 'australiaeast'
$LogAnalyticsName = 'la-free'
$SKU = 'Free'

#Creates Log Analytics Workspace
New-AzOperationalInsightsWorkspace -Location $Location -Name $LogAnalyticsName -Sku $SKU -ResourceGroupName $ResourceGroup

If you get the error 'Pricing tier doesn't match the subscription's billing model', unfortunately it means that your subscription is under a different billing model, possibly because it was created recently, and you are unable to use the 'Free' tier; instead, you may have to create the workspace using the 'standard' tier.
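
If you want the script to handle that case automatically, here is a hedged sketch that reuses the variables from the script above and retries with the 'standard' tier when 'Free' is rejected:

```powershell
# Try the legacy Free SKU first; fall back to 'standard' if the billing model rejects it
try {
    New-AzOperationalInsightsWorkspace -Location $Location -Name $LogAnalyticsName -Sku 'Free' -ResourceGroupName $ResourceGroup -ErrorAction Stop
}
catch {
    Write-Warning "Free tier not available on this subscription; creating with 'standard' instead."
    New-AzOperationalInsightsWorkspace -Location $Location -Name $LogAnalyticsName -Sku 'standard' -ResourceGroupName $ResourceGroup
}
```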

Azure Log Analytics - Free

· One min read

This is just an additional configuration that may help with sizing and pricing Log Analytics: you can set a 'Daily cap' on the amount of data you ingest per day to help restrict cost.

The downside of this is if you reach the cap, you will no longer collect any data, until the following day, meaning you may miss key events or issues.

This is something that I would recommend doing ONLY if you run into financial constraints, giving you more time to work through them, situation depending, of course.

This is a pretty quick 'How To' so let's get straight into it:

  1. Log in to the Azure Portal
  2. Search for your Log Analytics Workspace
  3. Select Usage and estimated costs
  4. Click on Daily Cap
  5. Set your cap in GB (I put 0.166, as my thinking was 5 GB free each month, so roughly 166 MB a day should cap my Log Analytics workspace; although useful for this demo/lab, it's not a number I would recommend for Production)
  6. Click Ok

Log Analytics - Set Daily Cap

· 12 min read

Microsoft Azure uses roles to define who can access what - Role-Based Access Control (RBAC).

You may be familiar with some of the more common ones, such as:

  • Owner
  • Contributor
  • Reader

Behind the scenes, each role is a separate grouping of permissions that determine what level of permissions someone or something has in Azure; these permissions are usually in the form of:

  • Read
  • Write
  • Delete
  • Action

Each role can be assigned to a specific Resource, Subscription, Management Group or Resource Group through an 'Assignment' (you create an assignment when you give someone Contributor rights to a Resource Group, for example).

These permissions can be manipulated and custom roles created.

Why would you use custom roles you ask? As usual - it depends!

Custom Roles can give people or objects JUST the right amount of permissions to do what they need to do, nothing more and nothing less. An example: say you are onboarding a support partner who will only be supporting your Logic Apps, WebApps and Backups; you may not want them to be able to log support cases for your Azure resources. Instead of attempting to mash several roles together that may give more or fewer rights than you need, you can create a custom role that specifically gives them what they need, then increase or decrease the permissions as required. However, if a built-in role already exists for what you want, there is no need to reinvent the wheel, so use it!

I will run through a few things to help you understand and build your own Custom Roles, primarily using PowerShell.

Install the Azure PowerShell Modules

As a pre-requisite for the following, you need to install the Azure (Az) PowerShell Module. You can skip this section if you already have the PowerShell modules installed.

  1. Open Windows PowerShell

  2. Type in:

    Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
    Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force
  3. If you have issues installing the Azure PowerShell module - see the Microsoft documentation directly: Install the Azure Az PowerShell module.

  4. Once you have the Azure PowerShell module installed, you can connect to your Azure subscription using the little snippet below:

    #Prompts for Azure credentials
    Connect-AzAccount
    #Prompts a window allowing you to select which Azure subscription to connect to
    Get-AzSubscription | Out-GridView -Title 'Select Azure Subscription' -PassThru | Set-AzContext

Export Built-in Azure Roles

One of the best ways to learn about how an Azure Role is put together is to look at the currently existing roles.

  1. The following PowerShell command will list all current Azure roles:

    Get-AzRoleDefinition
  2. For a more human-readable view that lists the Built-in Azure roles and their descriptions, you can filter it by:

    Get-AzRoleDefinition | Select-Object Name, Description
  3. As you can see in the screenshot below, there are many different roles, from EventGrid Contributor to AgFood Platform Service and more! At the time of this article, there were 276 built-in roles.

  4. Azure Builtin Roles

  5. Now that we have successfully been able to pull a list of the existing roles, we will now export them as JSON files to take a proper look at them.

  6. The PowerShell script below will create a few folders on your computer as a base to work from (feel free to change the folders to suit your folder structure or access rights).

    • c:\Temp
    • c:\Temp\AzureRoles
    • C:\Temp\AzureRoles\BuiltinExports\
    • C:\Temp\AzureRoles\CustomRoles
  7. Once the folders have been created, it will Get the Azure Role definitions and export them into JSON into the BuiltinExports folder to be reviewed.

    New-Item -ItemType Directory -Path c:\Temp -Force
    New-Item -ItemType Directory -Path c:\Temp\AzureRoles -Force
    New-Item -ItemType Directory -Path c:\Temp\AzureRoles\BuiltInExports -Force
    New-Item -ItemType Directory -Path c:\Temp\AzureRoles\CustomRoles -Force

    $roles = Get-AzRoleDefinition

    Foreach ($role in $roles) {
        $name = $role.Name
        Get-AzRoleDefinition -Name $name | ConvertTo-Json | Out-File "c:\Temp\AzureRoles\BuiltInExports\$name.json"
    }
  8. Once completed, you should now see the JSON files below:

  9. Azure Role - JSON files

Although you can use Notepad, I recommend using Visual Studio Code to read these files, as it will help with the syntax as well.

Review Built-in Azure Roles

Open one of the roles; I will open the Azure Digital Twins Data Owner role, however, it doesn't matter which one you pick.

You should see the following fields:

  • Name
  • Id
  • IsCustom
  • Description
  • Actions
  • NotActions
  • DataActions
  • NotDataActions
  • AssignableScopes

These fields make up your Role.

Azure Role - JSON

  • The Name field is pretty self-explanatory - this is the name of the Azure Role and what you see in the Azure Portal, under Access control (IAM).

  • Azure Portal - Role

  • The same is true for the: Description field.

    These are essential fields as they should tell the users what resource or resources the role is for and what type of access is granted.

  • The IsCustom field is used to determine whether the Azure role is custom-made; any user-created role will be set to True, while any built-in role will be False.

  • The Actions field is used to determine what management operations can be performed. However, the Azure Digital Twins role doesn't have any (as it is mainly DataActions based), so let's look at another role, such as the Azure Kubernetes Service RBAC Admin role:

    • "Microsoft.Authorization/*/read",
    • "Microsoft.Insights/alertRules/*",
    • "Microsoft.Resources/deployments/write",

    You can see that it has the rights to read authorization information, create and delete alert rules, and write deployments.

  • The NotActions field is used to exclude anything from the allowed Actions.

  • The DataActions field allows you to determine what data operations can be performed. Usually these are sub-resource tasks: management or higher-level operations are performed in the Actions field, while more specific resource actions are performed in the DataActions field.

    The NotDataActions field is used to exclude anything from the allowed actions in DataActions.

To help get a feel of the differences with the Actions, here is a list of Actions and DataActions for the Azure Kubernetes Service RBAC Admin role:

  • Azure Custom Role - JSON
  • And finally, the AssignableScopes field is used to specify where the role will be available for assignment, whether at a subscription, resource group or management group level. You will notice that most, if not all, built-in Azure roles have an assignable scope of "/" - this means they can be assigned everywhere (Subscriptions, Resource Groups, Management Groups etc.).
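
You can check this for yourself against any built-in role; for example:

```powershell
# Built-in roles are assignable everywhere, so this returns "/"
(Get-AzRoleDefinition -Name 'Contributor').AssignableScopes
```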

Review Azure Provider Namespaces

You may have noticed that each Action has a provider. In the example of a Virtual Machine, the provider is Microsoft.Compute.

  1. To get a list of all current Providers, run the following command:

    Get-AzProviderOperation | Select-Object ProviderNamespace -Unique

    At the time of writing, there are 198 current Providers! That's 198 overall buckets of resources that Azure has permissions over.

  2. We can drill into a provider a bit further to check out current Operations:

    Get-AzProviderOperation -Name Microsoft.Compute/*
  3. This displays a list of all operations within the Microsoft.Compute namespace, covering (but definitely not limited to):

    1. Virtual machines
    2. Virtual Machine Scale Sets
    3. Locations
    4. Disks
    5. Cloud Services
  4. If we wanted to drill into the Virtual Machines providers a bit more, we could filter it like:

    Get-AzProviderOperation -Name Microsoft.Compute/virtualMachines/*
  5. Here we can finally see the available actions; for example, the following Action will allow you to read the VM sizes available to a Virtual Machine:

  • Operation: Microsoft.Compute/virtualMachines/vmSizes/read
  • OperationName: Lists Available Virtual Machine Sizes
  • ProviderNamespace: Microsoft Compute
  • ResourceName: Virtual Machine Size
  • Description: Lists available sizes the virtual machine can be updated to
  • IsDataAction: False
  1. You can use the PowerShell script below to export all the Providers and their Operations to a CSV for review:

    $Providers = Get-AzProviderOperation

    $results = @()

    ForEach ($Provider in $Providers) {
        $results += [pscustomobject]@{
            'Provider NameSpace' = $Provider.ProviderNamespace
            Description          = $Provider.Description
            'Operation Name'     = $Provider.OperationName
            Operation            = $Provider.Operation
            ResourceName         = $Provider.ResourceName
        }
    }

    $results | Export-Csv c:\temp\AzureRBACPermissions.csv -NoTypeInformation

Using the namespace, providers and actions, you should now be able to see the power behind Role-based access control and how granular you can get.

Add a Custom Role using PowerShell

Now that we understand how to navigate the namespaces and built-in roles available in Microsoft Azure using PowerShell, we will create a role of our own.

I have created a base template to help you start.

This base template has the following fields that the majority of custom roles will use:

  • Name
  • IsCustom
  • Description
  • Actions
  • AssignableScopes (make sure you put in the ID of the Azure subscription you are assigning the role to.)
  1. Edit these fields (apart from IsCustom, which you should leave as True) as you need.

{
    "Name": "Custom Role - Template",
    "IsCustom": true,
    "Description": "This is a Template for creating Custom Roles.",
    "Actions": [
        "Microsoft.Resources/subscriptions/resourceGroups/read",
        "Microsoft.Support/*"
    ],
    "NotActions": [],
    "DataActions": [],
    "NotDataActions": [],
    "AssignableScopes": [
        "/subscriptions/<subscription-id>"
    ]
}

This Custom Role - Template allows you to read the name of all Resource Groups in a subscription and open a Microsoft Support case.

In my example, I am going to add a new role called:

  • LukeGeek-WebApp Deployment-RW

This role will allow users to Deploy and modify Azure WebApps, among other things! The definition below is an example; the Microsoft.Web actions shown are illustrative, so adjust them to the exact operations you need:


{
    "Name": "LukeGeek-WebApp Deployment-RW",
    "IsCustom": true,
    "Description": "Allows users to deploy and modify Azure WebApps.",
    "Actions": [
        "Microsoft.Web/sites/*",
        "Microsoft.Web/serverfarms/read"
    ],
    "NotActions": [],
    "DataActions": [],
    "NotDataActions": [],
    "AssignableScopes": [
        "/subscriptions/<subscription-id>"
    ]
}

  1. To add the Custom Role to Azure, I will run the following PowerShell command:

    New-AzRoleDefinition -InputFile "C:\temp\AzureRoles\CustomRoles\LukeGeek-WebApp Deployment-RW.json" -Verbose

Your new Custom Role has now been uploaded to Azure and can be selected for an assignment.
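
A quick way to confirm the upload worked is to read the definition straight back:

```powershell
# Retrieve the custom role and confirm IsCustom is True
Get-AzRoleDefinition -Name 'LukeGeek-WebApp Deployment-RW' |
    Format-List Name, IsCustom, Description, Actions, AssignableScopes
```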

Add a Custom Role using the Azure Portal

Now that we have investigated the Azure roles and their providers and actions, instead of using PowerShell to look through and create them manually, you can use the Azure Portal!

Gasp! Why didn't you tell me earlier about this, Luke?

Well, fellow Azure administrator, I found it easier to use PowerShell and JSON to explain how the Custom Roles are made, versus staring at the Azure Portal - and, to be honest, really just because! Like most things in IT, there are multiple ways something can be done!

  1. Log in to the Azure Portal
  2. Navigate to your Subscription
  3. Click on Access Control (IAM) on the left-hand side blade
  4. Click on Add
  5. Click on Add Custom Role
  6. Type in the Role Name, for example, WebAdmin-RO
  7. Type in a clear description so that you can remember what this role is used for in a year!
  8. For Baseline permissions, select: Start from Scratch
  9. Click Next
  10. Click Add Permissions
  11. If you want, you can select Download all permissions to review the providers and actions (very similar to the Get-AzProviderOperation PowerShell command). Azure Portal - Create Custom Role
  12. As you should see, all the Namespace providers are listed with the Actions/Permissions that you can do.
  13. In my example, I am going to search for Microsoft Web Apps
  14. Select all 'Read' operations (remember to look at Data Actions as well, there may be resource level actions you might want to allow or exclude)
  15. Click Add
  16. Azure Portal - Create Custom Role
  17. Review the permissions and click Next
  18. Select your assignable scope (where the Role will be allowed so that you can assign it)
  19. Click Next
  20. You can review and download the JSON for backup later (this is handy if you are going to Automate the creation of roles in the future and want a base to start from)
  21. Click Next
  22. Click Create to create your Custom Role!
  23. Azure Portal - Create Custom Role

Assign a Custom Role using the Azure Portal

Now that you have created your Custom Role, it is time to assign it so that it is actually in use.

  1. Log in to the Azure Portal
  2. Navigate to your Subscription or Resource Group you want to delegate this role to
  3. Click on Access Control (IAM)
  4. Click Add
  5. Click on Role Assignment
  6. Under the 'Role' dropdown, select your Custom Role.
  7. Azure Portal - Add Role Assignments
  8. Now you can select the Azure AD Group/User or Service Principal you want to assign the role to and click Save.
  9. Congratulations, you have now assigned your Custom role!

Assign a Custom Role using PowerShell

You can assign Custom Roles using PowerShell. To do this, you need a few things, such as the Object ID, assignable scope IDs etc.; instead of rehashing it all, this Microsoft article does an excellent job of running through the process.
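
For completeness, here is a minimal sketch of what that looks like (the user principal name, role name and scope below are placeholders for your own values):

```powershell
# Look up the object ID of the user (this could also be a group or service principal)
$objectId = (Get-AzADUser -UserPrincipalName 'user@contoso.com').Id

# Assign the custom role at a resource group scope
New-AzRoleAssignment -ObjectId $objectId `
    -RoleDefinitionName 'LukeGeek-WebApp Deployment-RW' `
    -Scope '/subscriptions/<subscription-id>/resourceGroups/<resource-group>'
```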