
Azure Optimization Engine

· 21 min read

This post is a part of Azure Spring Clean, which is a community event focused on Azure management topics from March 14-18, 2022.

Thanks to Joe Carlyle and Thomas Thornton for putting in the time and organising this event.

This article, along with others of its kind (articles, videos, etc.), covers Azure Management topics such as Azure Monitor, Azure Cost Management, Azure Policy, Azure Security Principles and Azure Foundations!

Today I will be covering the Azure Optimization Engine.

#AzureSpringClean - Azure Optimization Engine

Overview

The Azure Optimization Engine (AOE) is an extensible solution designed to generate optimization recommendations for your Azure environment, like a fully customizable Azure Advisor.

The first custom recommendations use-case covered by this tool was augmenting Azure Advisor Cost recommendations, particularly Virtual Machine right-sizing, with a fit score based on VM (Virtual Machine) metrics and properties.

The Azure Optimization Engine can…

  • Enable new custom recommendation types
  • Augment Azure Advisor recommendations with richer details that better drive action
  • Add a fit score to recommendations
  • Add historical perspective to recommendations (the older the recommendation, the higher the chances to remediate it)
  • Drive continuous automated optimisation

The Azure Optimisation Engine combines multiple data sources to give you better data-driven decisions and recommendations than the inbuilt Azure Advisor usually delivers. Example use cases and data sources can be seen below:

  • Azure Resource Graph (Virtual Machine and Managed Disks properties)
  • Azure Monitor Logs (Virtual Machine performance metrics)
  • Azure Consumption (consumption/billing usage details events)
  • Extracts data periodically to build a recommendations history
  • Joins and queries data in an analytics-optimised repository (Log Analytics)
  • Virtual Machine performance metrics collected with Log Analytics agent
  • Can leverage existing customer setup
  • Requires only a few metrics collected with a frequency >= 60 seconds

Besides collecting all Azure Advisor recommendations, AOE includes other custom recommendations that you can tailor to your needs:

  • Cost
    • Augmented Advisor Cost VM right-size recommendations, with fit score based on Virtual Machine guest OS metrics (collected by Log Analytics agents) and Azure properties
    • Underutilized VM Scale Sets
    • Unattached disks
    • Standard Load Balancers without backend pool
    • Application Gateways without backend pool
    • VMs deallocated a long time ago (forgotten VMs)
    • Orphaned Public IPs
  • High Availability
    • Virtual Machine high availability (availability zones count, availability set, managed disks, storage account distribution when using unmanaged disks)
    • VM Scale Set high availability (availability zones count, managed disks)
    • Availability Sets structure (fault/update domains count)
  • Performance
    • VM Scale Sets constrained by lack of compute resources
  • Security
    • Service Principal credentials/certificates without expiration date
    • NSG rules referring to empty or non-existing subnets
    • NSG rules referring to orphan or removed NICs
    • NSG rules referring to orphan or removed Public IPs
  • Operational Excellence
    • Load Balancers without backend pool
    • Service Principal credentials/certificates expired or about to expire
    • Subscriptions close to the maximum limit of RBAC (Role Based Access Control) assignments
    • Management Groups close to the maximum limit of RBAC assignments
    • Subscriptions close to the maximum limit of resource groups
    • Subnets with low free IP space
    • Subnets with too much IP space wasted
    • Empty subnets
    • Orphaned NICs

Feel free to skip to the Workbook and Power BI sections to look at some of the out-of-the-box data and recommendations.

The Azure Optimisation Engine is battle-tested

  • Providing custom recommendations since Nov 2019
  • Serving Azure customers worldwide
  • From smaller customers with 50-500 VMs to larger ones with more than 5K VMs
  • Several customer-specific developments (custom collectors and recommendation algorithms)
  • Flexible deployment options (multi-subscription and multi-tenant capable)
  • Based on cheap services (Azure Automation, Storage, small SQL Database)

A few hours after setting up the engine, you will get access to a Power BI dashboard and Log Analytics Workbooks with all Azure optimisation opportunities, coming from both Azure Advisor and the tailored recommendations included in the engine.

These recommendations are then updated every seven days.

It is worth noting that the Azure Optimisation Engine is NOT an official Microsoft product and, as such, is under no official support. It was created and is maintained by Hélder Pinto, a Senior Customer Engineer at Microsoft. I would like to take the opportunity to thank Hélder for the amazing work he is doing with this product on a continuous basis, and for giving me his blessing to write this article; he has already done an amazing job documenting it on GitHub.

Architecture

Azure Optimization Engine Architecture

Azure Optimization Engine runs on top of Azure Automation (with Runbooks for each data source) and Log Analytics. It is supplemented by a storage account to store the exported data and an Azure SQL database to help control ingestion (last processed blob and lines processed).

Install

Prerequisites

Taken directly from the GitHub repository readme, the prerequisites for Azure Optimization Engine are:

  • A supported Azure subscription (see the FAQs on GitHub)
  • Azure PowerShell 6.6.0+ (Azure Bicep support is not currently available but is being worked on)
  • Microsoft.Graph.Authentication and Microsoft.Graph.Identity.DirectoryManagement PowerShell modules
  • A user account with Owner permissions over the chosen subscription, so that the Automation Managed Identity is granted the required privileges over the subscription (Reader) and deployment resource group (Contributor)
  • (Optional) A user account with at least Privileged Role Administrator permissions over the Azure AD tenant, so that the Managed Identity is granted the required privileges over Azure AD (Global Reader)

During deployment, you'll be asked several questions. It would be best if you planned for the following:

  • Whether you're going to reuse an existing Log Analytics Workspace or create a new one. IMPORTANT: you should ideally reuse a workspace where you have VMs onboarded and already sending performance metrics (Perf table); otherwise, you will not fully leverage the augmented right-size recommendations capability. If this is not possible/desired for some reason, you can still manage to use multiple workspaces (see Configuring Log Analytics workspaces).
  • An Azure subscription to deploy the solution (if you're reusing a Log Analytics workspace, you must deploy into the same subscription the workspace is in).
  • A unique name prefix for the Azure resources being created (if you have specific naming requirements, you can also choose resource names during deployment)
  • Azure region

If the deployment fails for some reason, you can repeat it, as it is idempotent (i.e. it can be applied multiple times without changing the result). The same process is used to upgrade a previous deployment with the latest version. You have to keep the same deployment options, so make sure you document them.

We will now go through and install the prerequisites from scratch, as in this article I will be deploying the Azure Optimization Engine from my local workstation.

You can also install from the Azure Cloud Shell.

Install Azure PowerShell & Microsoft Graph modules
  1. Open Windows PowerShell

  2. Type in:

    Install-Module -Name Az,Microsoft.Graph.Authentication,Microsoft.Graph.Identity.DirectoryManagement -Scope CurrentUser -Repository PSGallery -Force

Install

Now that we have the prerequisites installed, let's set up Azure Optimization Engine!

  1. In your favourite web browser, navigate to the AzureOptimizationEngine GitHub repository.

  2. Select Code, Download Zip

  3. Azure Optimization Engine - GitHub

  4. Download and extract the ZIP file to a location you can easily navigate to in PowerShell (I have extracted it to C:\temp\AzureOptimizationEngine-master\AzureOptimizationEngine-master)

  5. Open PowerShell (or Windows Terminal)

  6. Because the scripts were downloaded from the internet, we will need to Unblock these so that we can run them, open PowerShell and run the script below (changing your path to the path that the files were extracted)

    Get-ChildItem -r 'C:\temp\AzureOptimizationEngine-master\AzureOptimizationEngine-master' | Unblock-File
  7. Now that the script and associated files have been unblocked change the directory to the location of the Deploy-AzureOptimizationEngine.ps1 file.

  8. Run: .\Deploy-AzureOptimizationEngine.ps1

  9. Windows Terminal -\Deploy-AzureOptimizationEngine.ps1

  10. A browser window will then popup, authenticate to Azure (connect to the Azure tenant that has access to the Azure subscription you wish to set up Azure Optimization Engine on).

  11. Once authenticated, you will need to confirm the Azure subscription to which you want to deploy Azure Optimization Engine.

  12. Azure Optimization Engine - Select Subscription

  13. Once your subscription is selected, it's time to choose a naming prefix for your resources (if you press Enter, you can manually name each resource instead); in my case, my prefix will be: aoegeek. Because Azure Optimization Engine creates resources whose names must be globally unique, make sure you select a prefix that suits your organisation/use case, or you may run into issues with the name already being in use.

  14. Azure Optimization Engine - Select Region

  15. If you have an existing Log Analytics workspace that your Virtual Machines and resources are connected to, you can specify 'Y' here to select your existing resource; I am creating this from fresh, so I will choose 'N'.

  16. Azure Log Analytics

  17. The Azure Optimization Engine will now check that the names and resources are available to be deployed to your subscriptions and resources (nothing is deployed during this stage - if there is an error, you can fix the issue and go back).

  18. Once validation has passed, select the region that Azure Optimization will be deployed to; I will deploy to australiaeast, so I choose 1.

  19. Azure Optimization Engine now requires a SQL Admin username for the SQL server and database it will create; I will go with: sqladmin

  20. Azure Optimization Engine - Region

  21. Now enter the password for the sqladmin account and press Enter

  22. Verify that everything is correct, then press Y to deploy Azure Optimization Engine!

  23. Windows Terminal - Deploy Azure Optimization Engine

  24. Deployment could take 10-25 minutes... (mine took 22 minutes and 51 seconds)

  25. While leaving the PowerShell window open, log into the Azure Portal; you should now have a new Resource Group, and your resources will start getting created... you can click on Deployments (under Settings navigation bar) in the Resource Group to review the deployment status.

  26. Azure Portal - Deployments

  27. If you notice a failure in the Deployment tab for 'PolicyDeployment', you can ignore it; it may have failed because the SQL Server hadn't been provisioned yet. Once it has been provisioned, you can navigate back to the failed deployment and click 'Redeploy' to deploy the SQL Security Alert policy.

Note: The Azure SQL database firewall will allow the public IP of the location the script was deployed from; you may need to adjust this depending on your requirements.
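If you later need to allow another public IP (for example, a different workstation that will run Power BI), a firewall rule can be added with Azure PowerShell. A minimal sketch, assuming hypothetical resource names from this walkthrough (substitute your own):

```powershell
# Hypothetical resource names - replace with the values from your AOE deployment
$resourceGroup = 'aoegeek-rg'
$sqlServerName = 'aoegeek-sql'
$myPublicIp    = '203.0.113.10'   # the client IP you want to allow

# Adds a single-address firewall rule to the Azure SQL logical server
New-AzSqlServerFirewallRule -ResourceGroupName $resourceGroup `
    -ServerName $sqlServerName `
    -FirewallRuleName 'MyWorkstation' `
    -StartIpAddress $myPublicIp `
    -EndIpAddress $myPublicIp
```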

Configure

Onboard Azure VMs to Log Analytics using Azure Policy and PowerShell

Now that Azure Optimization Engine has been installed, let's onboard our current and future Azure Virtual Machines to it using Azure Policy. This is required if you want Azure Advisor Virtual Machine right-size recommendations augmented with guest OS metrics. If you don't collect metrics from the Virtual Machines, you will still have a fully functional Optimisation Engine with many recommendations, but the Advisor Virtual Machine right-size ones will be served as-is.

  1. Open PowerShell and login to Azure using: Connect-AzAccount

  2. Connect to your Azure subscription that contains the Virtual Machines you want to onboard to Log Analytics

  3. Type:

    # Register the resource provider if it's not already registered
    Register-AzResourceProvider -ProviderNamespace 'Microsoft.PolicyInsights'
  4. The PowerShell script below will create a user-assigned managed identity, grant it the required role, assign the built-in Log Analytics policy against your subscription, and create a remediation task.

  5. Just update the variables to match your setup:

    #requires -Version 1.0
    # Variables
    # Enter your subscription name
    $subscriptionName = 'luke.geek.nz'
    # Enter the display name of your policy assignment
    $policyDisplayName = 'Deploy - Log Analytics' # Can't exceed 24 characters
    $location = 'australiaeast'
    $resourceGroup = 'aoegeek-rg'
    $UsrIdentityName = 'AOE_ManagedIdentityUsr'
    $param = @{
        logAnalytics = 'aoegeek-la'
    }
    # Get a reference to the subscription that will be the scope of the assignment
    $sub = Get-AzSubscription -SubscriptionName $subscriptionName
    $subid = $sub.Id
    # Creates the user-assigned managed identity
    $AzManagedIdentity = New-AzUserAssignedIdentity -ResourceGroupName $resourceGroup -Name $UsrIdentityName
    # Waits 10 seconds to allow Azure AD to replicate and recognise that the managed identity has been created
    Start-Sleep -Seconds 10
    # Grants the managed identity Log Analytics Contributor rights over the subscription
    New-AzRoleAssignment -ObjectId $AzManagedIdentity.PrincipalId -Scope ('/subscriptions/' + $subid) -RoleDefinitionName 'Log Analytics Contributor'
    # Get a reference to the built-in policy definition that will be assigned
    $definition = Get-AzPolicyDefinition | Where-Object -FilterScript {
        $_.Properties.DisplayName -eq 'Deploy - Configure Log Analytics extension to be enabled on Windows virtual machines'
    }
    # Create the policy assignment with the built-in definition against your subscription
    New-AzPolicyAssignment -Name $policyDisplayName -DisplayName $policyDisplayName -Scope ('/subscriptions/' + $subid) -PolicyDefinition $definition -IdentityType 'UserAssigned' -IdentityId $AzManagedIdentity.Id -Location $location -PolicyParameterObject $param
    # Creates a remediation task to deploy the extension to the VMs
    $policyAssignmentID = Get-AzPolicyAssignment -Name $policyDisplayName | Select-Object -Property PolicyAssignmentId
    Start-AzPolicyRemediation -Name 'Deploy - LA Agent' -PolicyAssignmentId $policyAssignmentID.PolicyAssignmentId -ResourceDiscoveryMode ReEvaluateCompliance

Note: The default 'Deploy - Configure Log Analytics extension to be enabled on Windows virtual machines' policy doesn't currently support Gen 2 or Windows Server 2022 Virtual Machines. If you have these, you can copy the Azure Policy definition and make your own with the new imageSKUs. This policy may eventually be replaced by the 'Configure Windows virtual machines to run Azure Monitor Agent' policy; although I haven't tested it yet, the script above can be modified to suit.

Onboard Azure VMs to Log Analytics using the Azure Portal

If you do not want to onboard VMs with Policy, you can do it manually via the Azure Portal.

  1. Open Azure Portal
  2. Navigate to Log Analytic Workspaces
  3. Click on the Log Analytic workspace that was provisioned for Azure Optimization Engine
  4. Navigate to Virtual Machines (under Workspace Data Sources)
  5. Click on the Virtual Machine you want to link up to the Log Analytics workspace, and click Connect - this will trigger the Log Analytics extension and agent to be installed. Repeat for any further Virtual Machines.
  6. Log Analytics - Connect VM
Setup Log Analytics Performance Counters

Now that we have Virtual Machines reporting to our Log Analytics instance, it's time to make sure we are collecting as much data as we need to give suitable recommendations. Luckily, a script called 'Setup-LogAnalyticsWorkspaces.ps1' is already included in the Azure Optimisation Engine repository to configure the performance counters.

  1. Open PowerShell (or Windows Terminal)

  2. Change the directory to the location of the Setup-LogAnalyticsWorkspaces.ps1, in the root folder of the repository extracted earlier

  3. Run the following PowerShell commands to download the required PowerShell Modules:

    Install-Module -Name Az.ResourceGraph
    Install-Module -Name Az.OperationalInsights
  4. Then run: .\Setup-LogAnalyticsWorkspaces.ps1

  5. The script will then go through all Log Analytics workspaces that you have access to and check for performance counters.

  6. Windows PowerShell - \Setup-LogAnalyticsWorkspaces.ps1

  7. If they are missing from the Log Analytics workspace, then you can run:

    ./Setup-LogAnalyticsWorkspaces.ps1 -AutoFix

or

    # Fix specific workspaces' configuration, using a custom counter collection frequency
    ./Setup-LogAnalyticsWorkspaces.ps1 -AutoFix -WorkspaceIds "d69e840a-2890-4451-b63c-bcfc5580b90f","961550b2-2c4a-481a-9559-ddf53de4b455" -IntervalSeconds 30
Setup Azure AD-based recommendations by granting permissions to the Managed Identity

Azure Optimization Engine has the ability to make recommendations based on Microsoft Entra ID roles and permissions, but in order to do that, the system-assigned managed identity of the Azure Optimization Engine Automation account needs to be given 'Global Reader' rights. As part of the deployment, you may have gotten the following error:

Cannot bind argument to parameter 'DirectoryRoleId' because it is an empty string.

Could not grant role. If you want Azure AD-based recommendations, please grant the Global Reader role manually to the aoegeek-auto managed identity or, for previous versions of AOE, to the Run As Account principal.

We are going to grant the Azure Automation account 'Global Reader' rights manually in the Azure Portal.

  1. Open Azure Portal
  2. Navigate to Automation Accounts
  3. Open your Azure Optimisation Engine automation account
  4. Navigate down the navigation bar to the Account Settings section and select: Identity
  5. Azure Automation - Identity
  6. Copy the object ID
  7. Now navigate to Microsoft Entra ID
  8. Click on Roles and Administrators
  9. Search for: Global Reader
  10. Select Global Reader and select + Add assignments
  11. Paste in the object ID copied earlier, and click Ok to grant Global Reader rights to the Azure Automation identity.
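If you prefer scripting over the portal, the same Global Reader assignment can be sketched with the Microsoft Graph PowerShell modules already listed in the prerequisites. This is an assumption-based sketch: the object ID is a placeholder, and Get-MgDirectoryRole only returns roles that have already been activated in the tenant.

```powershell
# Placeholder: the object ID of the aoegeek-auto managed identity copied from the portal
$objectId = '00000000-0000-0000-0000-000000000000'

# Sign in with rights to manage directory role membership
Connect-MgGraph -Scopes 'RoleManagement.ReadWrite.Directory'

# Look up the activated Global Reader directory role
$role = Get-MgDirectoryRole -Filter "displayName eq 'Global Reader'"

# Add the managed identity as a member of the role
New-MgDirectoryRoleMemberByRef -DirectoryRoleId $role.Id -BodyParameter @{
    '@odata.id' = "https://graph.microsoft.com/v1.0/directoryObjects/$objectId"
}
```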
Azure Automation - Runbooks & Automation

The wind that gives Azure Optimization Engine its lift is Azure Automation and its Runbooks; at the time I deployed this, I had one Azure Automation account and 33 runbooks!

Looking at the runbooks deployed, you can get a sense of what Azure Optimization Engine is doing...

NAME | TYPE
aoegeek-auto | Automation Account
Export-AADObjectsToBlobStorage (aoegeek-auto/Export-AADObjectsToBlobStorage) | Runbook
Export-AdvisorRecommendationsToBlobStorage (aoegeek-auto/Export-AdvisorRecommendationsToBlobStorage) | Runbook
Export-ARGAppGatewayPropertiesToBlobStorage (aoegeek-auto/Export-ARGAppGatewayPropertiesToBlobStorage) | Runbook
Export-ARGAvailabilitySetPropertiesToBlobStorage (aoegeek-auto/Export-ARGAvailabilitySetPropertiesToBlobStorage) | Runbook
Export-ARGLoadBalancerPropertiesToBlobStorage (aoegeek-auto/Export-ARGLoadBalancerPropertiesToBlobStorage) | Runbook
Export-ARGManagedDisksPropertiesToBlobStorage (aoegeek-auto/Export-ARGManagedDisksPropertiesToBlobStorage) | Runbook
Export-ARGNICPropertiesToBlobStorage (aoegeek-auto/Export-ARGNICPropertiesToBlobStorage) | Runbook
Export-ARGNSGPropertiesToBlobStorage (aoegeek-auto/Export-ARGNSGPropertiesToBlobStorage) | Runbook
Export-ARGPublicIpPropertiesToBlobStorage (aoegeek-auto/Export-ARGPublicIpPropertiesToBlobStorage) | Runbook
Export-ARGResourceContainersPropertiesToBlobStorage (aoegeek-auto/Export-ARGResourceContainersPropertiesToBlobStorage) | Runbook
Export-ARGUnmanagedDisksPropertiesToBlobStorage (aoegeek-auto/Export-ARGUnmanagedDisksPropertiesToBlobStorage) | Runbook
Export-ARGVirtualMachinesPropertiesToBlobStorage (aoegeek-auto/Export-ARGVirtualMachinesPropertiesToBlobStorage) | Runbook
Export-ARGVMSSPropertiesToBlobStorage (aoegeek-auto/Export-ARGVMSSPropertiesToBlobStorage) | Runbook
Export-ARGVNetPropertiesToBlobStorage (aoegeek-auto/Export-ARGVNetPropertiesToBlobStorage) | Runbook
Export-AzMonitorMetricsToBlobStorage (aoegeek-auto/Export-AzMonitorMetricsToBlobStorage) | Runbook
Export-ConsumptionToBlobStorage (aoegeek-auto/Export-ConsumptionToBlobStorage) | Runbook
Export-RBACAssignmentsToBlobStorage (aoegeek-auto/Export-RBACAssignmentsToBlobStorage) | Runbook
Ingest-OptimizationCSVExportsToLogAnalytics (aoegeek-auto/Ingest-OptimizationCSVExportsToLogAnalytics) | Runbook
Ingest-RecommendationsToSQLServer (aoegeek-auto/Ingest-RecommendationsToSQLServer) | Runbook
Recommend-AADExpiringCredentialsToBlobStorage (aoegeek-auto/Recommend-AADExpiringCredentialsToBlobStorage) | Runbook
Recommend-AdvisorAsIsToBlobStorage (aoegeek-auto/Recommend-AdvisorAsIsToBlobStorage) | Runbook
Recommend-AdvisorCostAugmentedToBlobStorage (aoegeek-auto/Recommend-AdvisorCostAugmentedToBlobStorage) | Runbook
Recommend-ARMOptimizationsToBlobStorage (aoegeek-auto/Recommend-ARMOptimizationsToBlobStorage) | Runbook
Recommend-LongDeallocatedVmsToBlobStorage (aoegeek-auto/Recommend-LongDeallocatedVmsToBlobStorage) | Runbook
Recommend-UnattachedDisksToBlobStorage (aoegeek-auto/Recommend-UnattachedDisksToBlobStorage) | Runbook
Recommend-UnusedAppGWsToBlobStorage (aoegeek-auto/Recommend-UnusedAppGWsToBlobStorage) | Runbook
Recommend-UnusedLoadBalancersToBlobStorage (aoegeek-auto/Recommend-UnusedLoadBalancersToBlobStorage) | Runbook
Recommend-VMsHighAvailabilityToBlobStorage (aoegeek-auto/Recommend-VMsHighAvailabilityToBlobStorage) | Runbook
Recommend-VMSSOptimizationsToBlobStorage (aoegeek-auto/Recommend-VMSSOptimizationsToBlobStorage) | Runbook
Recommend-VNetOptimizationsToBlobStorage (aoegeek-auto/Recommend-VNetOptimizationsToBlobStorage) | Runbook
Remediate-AdvisorRightSizeFiltered (aoegeek-auto/Remediate-AdvisorRightSizeFiltered) | Runbook
Remediate-LongDeallocatedVMsFiltered (aoegeek-auto/Remediate-LongDeallocatedVMsFiltered) | Runbook
Remediate-UnattachedDisksFiltered (aoegeek-auto/Remediate-UnattachedDisksFiltered) | Runbook

A lot of the runbooks link up to Azure Automation variables, such as the Log Analytics workspace ID, or the period in days to look back for Advisor recommendations; by default, this is '7', but you can change this variable to suit your organisation's needs.
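For example, the Advisor look-back period can be changed from PowerShell. A sketch assuming the variable name AOE used at the time of writing (AzureOptimization_RecommendAdvisorPeriodInDays - check your Automation account's Variables pane for the exact name in your version):

```powershell
# Assumed names: adjust the resource group, Automation account and variable name to your deployment
Set-AzAutomationVariable -ResourceGroupName 'aoegeek-rg' `
    -AutomationAccountName 'aoegeek-auto' `
    -Name 'AzureOptimization_RecommendAdvisorPeriodInDays' `
    -Value 30 `
    -Encrypted $false
```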

Azure Automation - Runbooks & Automation

Azure Automation - Schedules

Along with containing the variables and configurations used by the Runbooks, the Automation account also contains the schedules for ingesting data into the storage account and SQL database. Most of these are daily, but some schedules, such as ingesting from Azure Advisor, are weekly. By default, these times are in UTC.

Azure Automation - Schedules

When making changes to these schedules (or moving the Runbooks to be run from a Hybrid worker), it is recommended to use the Reset-AutomationSchedules.ps1 script. These times need to be in UTC.

Terminal - Reset-AutomationSchedules.ps1

Azure Automation - Credentials

When we set up the Azure SQL database earlier, as part of the Azure Optimisation Engine setup, we configured the SQL Admin account and password; these credentials are stored and used by the Runbooks via the Azure Automation Credentials pane.

Azure Automation - Credentials

View Recommendations

It's worth noting that, because Azure Optimization Engine stores its data in Log Analytics and SQL, you can use languages such as KQL directly against the Log Analytics workspace to pull out any information you might need and develop integrations into other toolsets.
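As a sketch of what that looks like, the AOE data can be queried from PowerShell with the Az.OperationalInsights module. The custom table name below is an assumption based on the AOE ingestion runbooks at the time of writing; verify it against your own workspace (the workspace ID is a placeholder):

```powershell
# Placeholder - copy your workspace ID from the Log Analytics workspace Overview blade
$workspaceId = '00000000-0000-0000-0000-000000000000'

# KQL: a sample of recently ingested AOE recommendations (table name may vary by AOE version)
$query = @'
AzureOptimizationRecommendationsV1_CL
| where TimeGenerated > ago(7d)
| take 10
'@

$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query
$result.Results | Format-Table
```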

Workbooks

There are three Azure Log Analytics workbooks included in the Azure Optimization Engine; these are as follows:

NAME | TYPE
Resources Inventory | Azure Workbook
Identities and Roles | Azure Workbook
Costs Growing | Azure Workbook

They can be easily accessed in the Azure Portal.

  1. Log in to the Azure Portal
  2. Navigate to Log Analytics Workspace
  3. Click on the Log Analytics workspace you set up for Azure Optimization Engine earlier and click on Workbooks (under General).
  4. Click on the Workbooks filter at the top to display the three Azure Optimization Engine workbooks
  5. Log Analytics - Workbooks
  6. After a few days of collecting data, you should now be able to see data like below.
Resource Inventory - General

Resource Inventory - General

Resource Inventory - Virtual Machines

Resource Inventory - Virtual Machines

Resource Inventory - Virtual Machine ScaleSets

Resource Inventory - Virtual Machine ScaleSets

Resource Inventory - Virtual Machine ScaleSets Disks

Resource Inventory - Virtual Machine ScaleSets Disks

Resource Inventory - Virtual Networks

Resource Inventory - Virtual Networks

Identities and Roles - Overview

Identities and Roles - Overview

Power BI

The true power of the Azure Optimisation Engine is the data stored in the SQL database; using Power BI, you can pull that data into dashboards and make the recommendations from SQL more meaningful.

The Optimisation Engine already includes a starter Power BI file, which pulls data from the database.

Install PowerBI Desktop
  1. Open Microsoft Store and search for: Power BI Desktop
  2. Click Get
  3. Power BI Desktop
  4. Once Downloaded, click Open
Obtain Azure SQL Information

In order to connect PowerBI to the Azure SQL database, we need to know the URL of the database and make sure our IP has been opened on the Azure SQL Firewall.

  1. Open Azure Portal
  2. Navigate to SQL Servers
  3. Click on the SQL server created earlier, under the Security heading click on Firewall and Virtual Networks
  4. Under: Client IP address, make sure your public IP is added and click Save
  5. Azure SQL - Virtual Network
  6. Now that we have verified/added our client IP, we need to get the SQL database (not server) URL
  7. Click on Overview
  8. Click on the aoeoptimization database (under Available resources, down the bottom)
  9. Click on Copy to Clipboard for the server Name/URL
  10. Azure SQL - Database URL
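If you prefer PowerShell to clicking through the portal, the server's fully qualified domain name can also be retrieved with the Az.Sql module. A sketch using the assumed resource names from this walkthrough (substitute your own):

```powershell
# Assumed names from this deployment - substitute your own resource group and server
$server = Get-AzSqlServer -ResourceGroupName 'aoegeek-rg' -ServerName 'aoegeek-sql'

# This FQDN is the value Power BI needs as the SQL server address
$server.FullyQualifiedDomainName
```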
Open PowerBI Desktop File

Now that we have PowerBI Desktop installed, it's time to open: AzureOptimizationEngine.pbix. This PowerBI file is located in the Views folder of the Azure Optimization Engine repository.

  1. Open: AzureOptimizationEngine.pbix in PowerBI Desktop
  2. On the Home page ribbon, click on Transform Data
  3. Click Data source settings
  4. Click Change Source
  5. Change the default SQL server of aoedevgithub-sql.database.windows.net to your SQL database, copied earlier.
  6. Click Ok
  7. Click Ok and press Apply Changes
  8. It will prompt for credentials, click on Database
  9. Enter the sqladmin credentials you set as part of the Azure Optimization Engine setup
  10. Click Connect

After Power BI refreshes its data sources and queries, your Power BI report should be populated with data like below.

PowerBI - Overview

PowerBI - Overview

PowerBI - Cost

PowerBI - Cost

PowerBI - High Availability

PowerBI - High Availability

PowerBI - Security

PowerBI - Security

PowerBI - Operational Excellence

PowerBI - Operational Excellence

Congratulations! You have now successfully stood up and configured Azure Optimization Engine!

Setup Azure Cloud Shell

· 3 min read

The Azure Cloud Shell allows connectivity to Microsoft Azure using an authenticated, browser-based shell experience that’s hosted in the cloud and accessible from virtually anywhere as long as you have access to your favourite browser!

Azure Cloud Shell is assigned per unique user account and automatically authenticated with each session.

Get a modern command-line experience from multiple access points, including the Azure portal, shell.azure.com, the Azure mobile app, the Azure docs (e.g. Azure CLI, Azure PowerShell), and the VS Code Azure Account extension.

Both Bash and PowerShell experiences are available.

Microsoft routinely maintains and updates Cloud Shell, which comes equipped with commonly used CLI tools including Linux shell interpreters, PowerShell modules, Azure tools, text editors, source control, build tools, container tools, database tools, and more. Cloud Shell also includes language support for several popular programming languages such as Node.js, .NET, and Python.

Along with native tools such as Azure PowerShell, it also contains Terraform, allowing you to implement and test functionality such as Infrastructure as Code without needing to touch your local machine, and it is always up-to-date. A full list of tools can be found 'here'.

Some notable things to be aware of regarding the Azure Cloud Shell:

  • Cloud Shell runs on a temporary host provided on a per-session, per-user basis
  • Cloud Shell times out after 20 minutes without interactive activity
  • Cloud Shell requires an Azure file share to be mounted
  • Cloud Shell uses the same Azure file share for both Bash and PowerShell
  • Cloud Shell is assigned one machine per user account
  • Cloud Shell persists $HOME using a 5-GB image held in your file share
  • Permissions are set as a regular Linux user in Bash

The Azure Cloud Shell is very easy to set up and get going, but in this article, I will show you the additional configuration options you have available, such as selecting your own storage account, region and resource group to conform to any naming policies and preferences you may have.

By default, Cloud Shell creates a new Resource Group, Storage account, and File share in the Southeast Asia region.

To set up Azure Cloud Shell

  1. Navigate to the Azure Portal
  2. Click on the little Terminal/PowerShell icon in the upper navigation bar
  3. Azure Portal - Cloud Shell
  4. You should get notified, "You have no storage mounted" click on Show advanced settings
  5. Azure Portal - Cloud Shell
  6. Here you can easily create your CloudShell storage account with your own preferences:
  • The subscription
  • Region
  • Resource Group (new or existing)
  • Storage account (new or existing)
  • Fileshare (new or existing)
  7. Azure Portal - Cloud Shell Storage Account
  8. Click on Create Storage when you are ready to start the verification and deployment (verification happens after you click Create storage; don't worry, as long as you have the window open you can still make additional changes).
  9. Azure Portal - Cloud Shell

Using this method is handy when you have more than one person administering the subscription; each person can have their own file share, which can then be backed up using Azure Backup and easily removed later when no longer needed.

Native RDP Client & Azure Bastion

· 4 min read

In early February 2022, Azure Bastion preview support for the native Windows SSH and RDP clients came out, meaning we no longer have to rely on the Azure Portal and the limitations of a web browser - the support also includes file transfer through the clipboard by copying and pasting into the RDP session!

Azure Bastion is a fully managed service that provides more secure and seamless Remote Desktop Protocol (RDP) and Secure Shell Protocol (SSH) access to virtual machines (VMs) without any exposure through public IP addresses. Provision the service directly in your local or peered virtual network to get support for all the VMs within it.

Let’s test the native RDP client through a secure connection using Azure Bastion!

Prerequisites

  • This configuration requires the Standard tier for Azure Bastion
  • A Virtual Machine(s) to connect to
  • Latest Azure CLI
  • Reader role on the Virtual Machine
  • Reader role on the Network Interface Card of the Virtual Machine
  • Reader role on the Azure Bastion resource
  • Virtual Machine Administrator (or User) Login role when using Microsoft Entra ID authentication

Create Azure Bastion

If you have a Virtual Machine but haven't set up Azure Bastion, run through the below to set it up:

  1. Log in to the Azure Portal
  2. Click on Create a resource
  3. Search for: Bastion
  4. Click Create
  5. As this is a networking resource, I will place it in the same Resource Group as my Virtual Network
  6. Type in a Name for the Bastion instance; I will call mine: Bastion
  7. Select the Region that matches the Virtual Network region
  8. Select Standard Tier
  9. Select the Virtual Network
  10. It will warn you that an AzureBastionSubnet with a prefix of at least /27 is required, so we need to create one; click on Manage Subnet Configuration
  11. Click + Subnet
  12. For the Name, type in: AzureBastionSubnet
  13. For the Subnet address range, enter: 10.0.1.0/27. If you get an error indicating the address overlaps with another subnet, it may be because the Address space is only a /24; click Cancel, click on Address Space in the Virtual Network and change the /24 to /16 to increase the address range
  14. Click Save to create the subnet
  15. At the top, click Create a Bastion to go back to the Bastion setup; your subnet should be selected automatically
  16. You do need a Public IP for Bastion, so confirm the name is appropriate, then click Next: Tags
  17. Add in appropriate tags, then click Next: Advanced
  18. Check the box next to Native client support (Preview)
  19. Click Next: Review + Create
  20. Click on Create to create your Bastion instance!

Note: Bastion may take 10-20 minutes to provision.
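For repeatable deployments, the portal steps above can be sketched with the Azure CLI. This is a minimal sketch, not a definitive script: the resource names, region and address ranges are assumptions to replace with your own, and it assumes the AzureBastionSubnet already exists in the virtual network.

```shell
# Assumed names/region (network-rg, vm-vnet, australiaeast) - adjust to your environment.
# Bastion requires a Standard-SKU public IP.
az network public-ip create --resource-group network-rg --name bastion-pip \
  --sku Standard --location australiaeast

# Create the Bastion host on the Standard tier; --enable-tunneling turns on
# native client support (assumes vm-vnet already contains an AzureBastionSubnet).
az network bastion create --name Bastion --resource-group network-rg \
  --vnet-name vm-vnet --public-ip-address bastion-pip \
  --location australiaeast --sku Standard --enable-tunneling true
```

As with the portal deployment, expect the `az network bastion create` call to take a while to complete.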

Check Bastion SKU

If you already have an Azure Bastion instance, let's check the SKU and, if needed, change it to Standard. Just a note:

Downgrading from a Standard SKU to a Basic SKU is not supported. To downgrade, you must delete and recreate Azure Bastion.

  1. Log in to the Azure Portal
  2. Navigate to your Bastion resource
  3. Click on: Configuration
  4. Change Tier to Standard
  5. Check: Native client support (Preview)
  6. Click Apply

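The SKU check and upgrade can also be done from the Azure CLI; a small sketch, where the Bastion and resource group names are assumptions:

```shell
# Check the current SKU of an existing Bastion host (assumed names: Bastion, network-rg)
az network bastion show --name Bastion --resource-group network-rg --query "sku.name"

# Upgrade to Standard and enable native client (tunneling) support
az network bastion update --name Bastion --resource-group network-rg \
  --sku Standard --enable-tunneling true
```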
Connect to VM using Native RDP Support

  1. Open command prompt or Terminal

  2. Type: az login

  3. Login to your Azure subscription

  4. We need the resource ID of the VM we want to connect to; type in: az vm show --resource-group 'appserver-rg' --name 'APP-P01' --show-details

  5. Change the resource group and VM name above to match your VM

  6. Copy the id of the Virtual Machine you want to connect to

  7. Because I am running the Azure CLI from a PowerShell terminal, I am going to use the following variables:

    $BastionName = 'Bastion'
    $BastionRG = 'network-rg'
    $VMResourceID = '/subscriptions/000000-0000-0000-0000000/resourceGroups/appserver-rg/providers/Microsoft.Compute/virtualMachines/APP-P01'
    az network bastion rdp --name $BastionName --resource-group $BastionRG --target-resource-id $VMResourceID
  8. Run the command; your Remote Desktop window should open once the tunnel has been established. If you close the Azure CLI window, your RDP session will be dropped!


As you can probably tell, there are none of the options (drive passthrough etc.) you would usually find when connecting via Remote Desktop, but copying files and text does work!
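Bastion's native client support also covers SSH for Linux VMs. A hedged sketch using the same Bastion names as above; the target VM, resource ID, username and key path are hypothetical placeholders:

```shell
# Tunnel an SSH session through Bastion to a Linux VM (names/IDs are placeholders)
az network bastion ssh --name Bastion --resource-group network-rg \
  --target-resource-id "/subscriptions/<sub-id>/resourceGroups/appserver-rg/providers/Microsoft.Compute/virtualMachines/LINUX-P01" \
  --auth-type ssh-key --username azureuser --ssh-key ~/.ssh/id_rsa
```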

AzureFeeds

· 2 min read

Over the past few months, I have been busy working on a new project...

A news aggregator for Azure news and updates. I tested the desire for this using AzureFeeds on LinkedIn as a platform (which will continue)...

I didn't want my LinkedIn connections to be spammed by Azure updates every day, when some connections were connected to me for other non-Azure related services, so I created a separate account, AzureFeeds, that interested people could subscribe to.

I was lucky enough to acquire both https://azureupdates.com and https://azurefeeds.com/ and wanted to do something a bit more substantial than just forwarding to the relevant Microsoft pages.

Inspired by PlanetPowerShell, I wanted an Azure specific service, so I created the AzureFeeds website.

I originally started with official Microsoft sources only; given the massive number of ongoing updates and changes, I didn't want to make the feeds too busy. However, following community feedback through a Twitter poll, I added an author section and community-based content to supplement the official updates, mimicking the community-driven approach of PlanetPowerShell a lot more.

You can subscribe to a single feed by clicking 'Download Feed' using your favourite RSS reader (even Outlook) and get the content delivered straight to you from official and community updates! Allowing you to keep up with Azure Updates and Azure community-driven updates easily, or just browse the webpage and filter by Author or Category for content that you may be after.

Check out https://azurefeeds.com/


Datto Remote Management Azure VM Application Deployment

· 14 min read

The Azure Compute Gallery (which superseded the Shared Image Gallery) offers more than just Azure image management and replication; you can also deploy applications to your Virtual Machines.

Overview

An Azure Compute Gallery helps you build structure and organization around your Azure resources, like images and applications. An Azure Compute Gallery provides:

  • Global replication.
  • Versioning and grouping of resources for easier management.
  • Highly available resources with Zone Redundant Storage (ZRS) accounts in regions that support Availability Zones. ZRS offers better resilience against zonal failures.
  • Premium storage support (Premium_LRS).
  • Sharing across subscriptions, and even between Active Directory (AD) tenants, using Azure RBAC.
  • Scaling your deployments with resource replicas in each region.

Like images, Azure VM applications support both Linux and Windows operating systems and get these benefits.

While you can create an image of a VM with apps pre-installed, you would need to update your image each time you have application changes. Separating your application installation from your VM images means there’s no need to publish a new image for every line of code change.

Application packages provide benefits over other deployment and packaging methods:

  • Grouping and versioning of your packages
  • VM applications can be globally replicated to be closer to your infrastructure, so you don’t need to use AzCopy or other storage copy mechanisms to copy the bits across Azure regions.
  • Sharing with other users through Azure Role Based Access Control (RBAC)
  • Support for virtual machines, and both flexible and uniform scale sets
  • If you have Network Security Group (NSG) rules applied on your VM or scale set, downloading the packages from an internet repository might not be possible. And with storage accounts, downloading packages onto locked-down VMs would require setting up private links.
  • VM applications can be used with the DeployIfNotExists policy.

Azure VM Application packages (stored in an Azure Storage account) use multiple resources, as below:

  • Azure compute gallery: A gallery is a repository for managing and sharing application packages. Users can share the gallery resource, and all the child resources will be shared automatically. The gallery name must be unique per subscription. For example, you may have one gallery to store all your OS images and another gallery to store all your VM applications.
  • VM application: The definition of your VM application. This is a logical resource that stores the common metadata for all the versions under it. For example, you may have an application definition for Apache Tomcat and have multiple versions within it.
  • VM Application version: The deployable resource. You can globally replicate your VM application versions to target regions closer to your VM infrastructure. The VM Application Version must be replicated to a region before it may be deployed on a VM in that region.

There is no extra charge for using VM Application Packages, but you will be charged for the following resources:

  • Storage costs of storing each package and any replicas.
  • Network egress charges for replication of the first image version from the source region to the replicated regions. Subsequent replicas are handled within the region, so there are no additional charges.

Before we deploy our first VM application, there are a few things we need to be aware of:

  • VM Application requires an Azure Compute Gallery
  • VM Application requires an Azure storage account to store your applications
  • The VM Application gets downloaded to the VM using the name of the VM application (not the actual name and extension of your file in the storage account)
  • Currently, in order to retry a failed installation, you need to remove the application from the profile and add it back
  • No more than five applications per Virtual Machine deployed at a time
  • The maximum size of the application is 1 GB
  • You can't have multiple versions of the same application installed on a Virtual Machine, and a newer version will supersede an older version either via an upgrade command or complete reinstall.

In this article, we are going to deploy the Datto Remote Management & Monitoring agent to a Windows Server 2022 Virtual Machine. This agent is a simple executable that installs on a virtual machine and allows an MSP (Managed Service Provider) using the Datto toolset to remotely access and manage the machine, without requiring any other form of connectivity (Azure Bastion, RDP via Public IP, Site-to-Site VPN etc.). The same concept can be applied to any application (theoretically, you can also use this to run PowerShell or Chocolatey installs).

It's worth noting that VM Applications are currently in Public Preview; there is a good chance the way they operate and are configured will change when they become Generally Available.

Setup Azure VM Application Deployment

Prerequisites

In order to use VM Applications, we need:

  • A storage account
  • Azure Compute gallery
  • VM application definition and version (in my example: the Datto RMM agent)

Following the guide, we will run through the creation of everything from scratch. I am, however, assuming you already have the executable or application package and know the instructions to install/uninstall it, as each application is different. The Microsoft VM Applications docs give a few good examples for getting started with various applications.

Setup Storage Account

The Storage account is where your application will be placed; it uses blobs. Depending on the importance of your application deployments, you may want to go for geo-replication etc., but in this example, I will be going with a locally redundant, StorageV2 general-purpose account.

  1. Open the Azure Portal
  2. Click on + Create a Resource
  3. Search for: Storage account, and select it
  4. Click Create
  5. Select your subscription
  6. Select a Resource Group for your storage account, or create a new one
  7. Enter your storage account name (this needs to be globally unique)
  8. Select the region that your application will be in; although the application can be replicated to other regions, it's better to select your primary region here
  9. Select the performance and redundancy to match your requirements and click Next: Advanced
  10. You can leave most settings here as default; the application executable will need to be directly accessible, and make sure the Minimum TLS version is at least 1.2
  11. You don't need hierarchical namespace etc.; unselect 'Allow cross-tenant replication' unless this is a feature you use
  12. Click Review + Create to skip to the last blade; most defaults are fine, but if you want to adjust the blob retention and soft delete settings, go to the Data Protection tab, set them, then review your Configuration and select Create
  13. Go back to your storage account and click Configuration
  14. Make sure Allow storage account key access is Enabled; if it is not, select Enabled and click Save
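The same storage account can be created with the Azure CLI; a minimal sketch, where the account name, resource group and region are assumptions to replace with your own:

```shell
# Assumed names/region - replace with your own (storage account names must be globally unique)
az storage account create \
  --name vmappsstore123 \
  --resource-group vmapps-prod-rg \
  --location australiaeast \
  --kind StorageV2 \
  --sku Standard_LRS \
  --min-tls-version TLS1_2 \
  --allow-cross-tenant-replication false
```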

Now that we have the Storage account to store your application binaries, we need an Azure Compute Gallery (previously the Shared Image Gallery) to keep your application definition and version metadata.

  1. Open the Azure Portal
  2. Click on + Create a Resource
  3. Search for: Azure Compute Gallery and select it
  4. Click Create
  5. Select your subscription and resource group (in this case, I am going to use the same resource group as the Storage account I created earlier)
  6. Type in a name, and select your region
  7. Although not mandatory, use the opportunity to fill in a description of the purpose of the Compute Gallery for future reference
  8. Select Review + Create
  9. Verify everything is correct and click on: Create
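If you prefer scripting, a minimal Azure CLI sketch of the same gallery creation; the names and region are assumptions:

```shell
# Assumed names/region - adjust to your environment
az sig create \
  --resource-group vmapps-prod-rg \
  --gallery-name AzComputeGallery \
  --location australiaeast \
  --description "Gallery for VM application packages"
```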

Create Application Definition

VM application definitions are created within a gallery and carry information about the application and requirements for using it internally. This includes the operating system type for the VM application versions contained within the application definition. The name of your Application definition defines the name of the file that will be downloaded to your virtual machines.

  1. Open the Azure Portal
  2. Navigate to 'All Resources'
  3. Find and click on the Azure Compute Gallery you created earlier
  4. On the overview pane, select + Add
  5. Click on + VM application definition
  6. Your subscription and resource group should be automatically set to the location of the Compute Gallery; type in the name of your application
  7. Select your region
  8. Select the OS type - in my case, I select Windows
  9. Click Next: Publishing Options
  10. The following fields are not mandatory, but I recommend filling them in to help report on and manage your applications:
    • Description
    • End of life date
    • Eula link
    • Privacy URI
    • Release notes URI
  11. Click Review + create
  12. Verify your Configuration and select Create
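The equivalent Azure CLI sketch for the application definition, reusing the assumed gallery and application names from earlier:

```shell
# Assumed names - adjust to your environment
az sig gallery-application create \
  --resource-group vmapps-prod-rg \
  --gallery-name AzComputeGallery \
  --application-name DattoRMM \
  --os-type Windows \
  --location australiaeast \
  --description "Datto RMM agent"
```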

Create Application version

Now that we have the application definition set up, it's time to set up the version and upload our binary file.

  1. Open the Azure Portal

  2. Navigate to 'All Resources'

  3. Find and click on the Azure Compute Gallery you created earlier

  4. Click on Definitions (beside the Get Started link)

  5. Select your Application definition

  6. Click on: + Add

  7. Enter your version number; this will increment and grow as you adjust and troubleshoot your application. I recommend starting with 0.0.1 and working your way up, with 1.0.0 being potentially your final/production-ready release

  8. Select your Region

  9. Now we need to select our source application package (you can enter your blob URL if you know it); we haven't uploaded it to our storage account yet, so we will select Browse

  10. Select your Storage account

  11. Press + Container

  12. Enter the name of your container (it has to be in lowercase), such as the application name (to keep things separate, consider a container per application)

  13. Press Upload

  14. Browse to your file and select it

  15. Expand Advanced

  16. Make sure that Blob type is: Blob

  17. Click Upload

  18. Select your newly uploaded file and click Select

  19. Note: You can only upload one file as part of your package; you can upload a ZIP file and have your install script extract it

  20. The install script is the command used to install your application; by default, Windows applications are run with cmd. The working directory already contains your file, downloaded under the application name (i.e. DattoRMM); it needs to be renamed to include .exe and then run. I will switch to PowerShell for the install script, so will enter:

    powershell.exe -command "Rename-Item '.\DattoRMM' -NewName 'DattoRMM.exe'; Start-Process '.\DattoRMM.exe'"
  21. If you have a script to uninstall the application, enter it (in my case, I am just going to put a '.' to skip this, as I don't currently have an uninstall script developed)

  22. The rest of the Configuration isn't mandatory; the Update script is used by Azure when a new version of an application is created. By default, the Azure VM extension will treat an upgrade like a completely new install and run the install steps, unless an update script is defined

  23. Click Next: Replication

  24. Like Azure Compute Images, you can replicate your Azure VM applications across multiple regions (depending on where your workloads are), such as Australia East to West Europe, and store them on zone-redundant or locally-redundant storage. In my example, I am going to leave mine as one replica in Australia East on locally-redundant storage and click Review + create

  25. Verify everything looks ok and click Create to create your application version! This may take a few minutes, depending on your configuration and replication.
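Version creation can also be scripted with the Azure CLI. A hedged sketch: the blob URL is a hypothetical placeholder for the file you uploaded, and the install/remove commands mirror the ones entered in the portal above:

```shell
# Assumed names; --package-file-link is a placeholder for your uploaded blob's URL
az sig gallery-application version create \
  --resource-group vmapps-prod-rg \
  --gallery-name AzComputeGallery \
  --application-name DattoRMM \
  --version-name 0.0.1 \
  --location australiaeast \
  --package-file-link "https://<storageaccount>.blob.core.windows.net/dattormm/DattoRMM" \
  --install-command "powershell.exe -command \"Rename-Item '.\DattoRMM' -NewName 'DattoRMM.exe'; Start-Process '.\DattoRMM.exe'\"" \
  --remove-command "."
```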

Deploy Azure VM Application

Deploy Azure VM Application to Virtual Machines using the Azure Portal

Now that your Azure VM Application has been created, it is now time to deploy to a Virtual Machine. I have a Windows Server 2022 Datacenter Azure Gen 2 VM running as a Standard_B2ms as my test machine, and because I am going to use the Datto RMM agent to connect to the machine, I don't need any RDP ports open etc.

  1. Open the Azure Portal
  2. Navigate to Virtual Machines
  3. Click on your Virtual Machine
  4. Under Settings, click Extensions + Applications
  5. Click VM Applications
  6. Click + Add application
  7. Select your application (note you can select a particular version; by default, it is the latest)
  8. Click Ok
  9. You can select your Install Order (i.e. if you had multiple applications, you could select which one installs 1st, 2nd, 3rd and so on); I will select No Reference and click Save to start the deployment
  10. If you click Extensions, you should see that a VMAppExtension has started installing; click Refresh to update the status, and click on the Extension to see a more detailed status message - hopefully one indicating the install is a SUCCESS
  11. My Virtual Machine has now had the Datto Remote Management agent installed successfully and has appeared in the portal for me to connect to!
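For a single VM, the portal steps above roughly map to one Azure CLI call; the subscription ID and resource names below are placeholders:

```shell
# Assumed/placeholder names and subscription ID - adjust to your environment
az vm application set \
  --resource-group appserver-rg \
  --name APP-P01 \
  --app-version-ids "/subscriptions/<sub-id>/resourceGroups/vmapps-prod-rg/providers/Microsoft.Compute/galleries/AzComputeGallery/applications/DattoRMM/versions/0.0.1"
```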

Deploy Azure VM Application to Multiple Virtual Machines using PowerShell

I've created the PowerShell script below to deploy an application to multiple Virtual Machines at once; it can easily be adjusted into a PowerShell Runbook that runs periodically to install software on machines that may be missing it. As usual, please make sure you test any PowerShell scripts in a demo environment first.

$allvms = Get-AzVM
$applicationname = 'DattoRMM'
$galleryname = 'AzComputeGallery'
$galleryrg = 'vmapps-prod-rg'
$appversionname = '0.0.1'

try
{
    # Look up the gallery application version once; it is the same for every VM
    $appversion = Get-AzGalleryApplicationVersion `
        -GalleryApplicationName $applicationname `
        -GalleryName $galleryname `
        -Name $appversionname `
        -ResourceGroupName $galleryrg
    $packageid = $appversion.Id

    ForEach ($vm in $allvms)
    {
        $AzVM = Get-AzVM -ResourceGroupName $vm.ResourceGroupName -Name $vm.Name
        $app = New-AzVmGalleryApplication -PackageReferenceId $packageid
        Add-AzVmGalleryApplication -VM $AzVM -GalleryApplication $app
        Update-AzVM -ResourceGroupName $vm.ResourceGroupName -VM $AzVM -ErrorAction Stop
    }
}
catch [Microsoft.Azure.Commands.Compute.Common.ComputeCloudException]
{
    # Most likely failed due to duplicate package ID/identical version
    [Management.Automation.ErrorRecord]$e = $_

    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
    }

    $info
}

Troubleshooting VM Application

If you have problems installing a package, remember that the VM Application extension downloads your file to the server under the name of the Application (with no extension); it needs to be renamed with a valid extension as part of the install script.

Package Location

The package/extension location is here:

  • C:\Packages\Plugins\Microsoft.CPlat.Core.VMApplicationManagerWindows\{VERSION#}\

You will find your Application binary under Downloads.

Logs

For the extension status logs, navigate to:

  • C:\Packages\Plugins\Microsoft.CPlat.Core.VMApplicationManagerWindows\{VERSION#}\Status

You should see files such as:

  • 0.status

You can right-click these and open them in Notepad; you should see the timestamp and the last status message, which should be identical to what you see in the Azure Portal.

For the application install logs, navigate to:

  • C:\Packages\Plugins\Microsoft.CPlat.Core.VMApplicationManagerWindows\{VERSION#}\Downloads\{APPNAME}\{APPVERSION}\

You may see files such as:

  • stderr
  • stdout

You can right-click these and open them in Notepad; any errors will be noted in these.

Troubleshooting during preview