
Azure Private DNS Resolver and Azure Point to Site VPN

· 8 min read

If you're connecting to a Microsoft Azure network externally (from a non-Azure VM or VPN), you might need to access resources such as Azure SQL databases or Azure Storage accounts. If you run Cloud-only services and don't have your own DNS provider, such as Active Directory, then to connect to Private Link resources you may have to edit your local hosts file and override local DNS to point to the IP of the private endpoint for each service.

This is neither sustainable nor scalable, and you might end up throwing your hands in the air, switching back to public-facing services, and simply whitelisting users' public IPs to gain access to Azure resources. That brings its own set of issues: unmanaged IPs can be left with access to resources after contractors or users leave or finish their work, and IP address changes, if not managed correctly, can give another user or company a direct line of sight to your company resources.


Today we will concentrate on DNS resolution of Private Endpoints, using Azure DNS Private Resolver as a DNS proxy when connecting to Azure using a Point to Site VPN.

For this article, I assume you already have an Azure Point to Site VPN set up; if you don't, you can refer to a previous article I wrote on Creating an Azure Point to Site VPN using Microsoft Entra ID authentication.

Disclaimer: Azure Private DNS Resolver is still in Public Preview at the time of this article (02/09/2022). If you aim to use this in a Production scenario, functionality and services may change. This also means there are current regional restrictions, and Azure Private DNS Resolver is not currently available in all regions. Also bear in mind the cost of this service.

So what is Azure DNS Private Resolver?

Azure DNS private resolver is a cloud-native, highly available, and DevOps-friendly service. It provides a simple, zero-maintenance, reliable, and secure Domain Name System (DNS) service to resolve and conditionally forward DNS queries from a virtual network, on-premises, and to other target DNS servers without the need to create and manage a custom DNS solution. Resolve DNS names hosted in Azure Private DNS Zones from on-premises networks as well as DNS queries for your own domain names. This will make your DNS infrastructure work privately and seamlessly across on-premises networks and enable key hybrid networking scenarios.

Azure Private DNS Resolver

Customers will no longer need to provision IaaS-based solutions on their virtual networks to resolve names registered on Azure Private DNS Zones and can do conditional forwarding of domains back to on-premises, across multi-cloud providers, and public DNS servers.

This solution can work with your Azure ExpressRoute, Azure VPN, or Azure Bastion setup.


Inbound or Outbound?

Name resolution queries for Azure workloads from the on-premises network are conditionally forwarded to the Azure DNS private resolver inbound endpoint, which enables you to perform name resolution of workloads registered on Azure Private DNS Zones from on-premises.

| Endpoint | Purpose |
| --- | --- |
| Inbound Endpoint | The Azure DNS private resolver inbound endpoint receives name resolution requests from Azure and on-premises networks and resolves names. |
| Outbound Endpoint | The Azure DNS private resolver outbound endpoint conditionally forwards requests to on-premises or other target DNS servers. |

The Azure DNS private resolver inbound endpoint has a private IP that is part of a subnet where the endpoint has been created. The IP address of the DNS private resolver inbound endpoint is then set as a DNS server on the on-premises network.


Today, we will concentrate on the Inbound functionality of Azure Private DNS Resolver to connect to private endpoints.


To deploy Azure Private DNS Resolver, we will need a few things.

  • A Virtual Network
  • A subnet dedicated to resolving DNS queries (/28)
  • A private endpoint (e.g. a Storage Account or SQL Database) linked to the virtual network.
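The /28 requirement is easy to sanity-check. Here is a minimal Python sketch (the address range shown is an example) of how many usable addresses that gives you, given that Azure reserves five IP addresses in every subnet:

```python
import ipaddress

# A dedicated /28 subnet for the DNS Private Resolver inbound endpoint (example range).
subnet = ipaddress.ip_network("10.0.1.0/28")

total = subnet.num_addresses   # a /28 contains 16 addresses
usable = total - 5             # Azure reserves 5 IP addresses in every subnet
print(total, usable)
```

Eleven usable addresses is more than enough for a dedicated resolver endpoint subnet.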

Deploy DNS Private Resolver

I assume you already have a Virtual Network tied to your Virtual Network gateway as part of the 'Point to Site VPN' setup.

  1. Open the Azure Portal
  2. Click on + Create a resource
  3. Type in: DNS Private Resolver
  4. Click Create
  5. Select your Subscription
  6. Select your Resource Group (I recommend placing the DNS Private Resolver in the same resource group as your Virtual Network - but place this service in a Resource Group that makes sense for your environment, i.e. a shared services or specific network team resource group)
  7. Type in a name for your DNS Private Resolver (this is a regional service, but the name does not need to be globally unique)
  8. Select your region (this needs to be the same region as the Virtual Network)
  9. Select your Virtual Network (the same Virtual Network that has your Virtual Network Gateway for the Point to Site VPN and your private endpoints)
  10. Click Next: Inbound Endpoints >
  11. Now it's time to add our Inbound Endpoint and create the Private DNS Resolver subnet; click + Add an endpoint
  12. Type in your endpoint name (for example, InboundEndpoint)
  13. If you have already created a subnet, select it - else, click Create New
  14. Enter your subnet name and address range
  15. Click Save
  16. Click Review + Create
  17. Click Create

Adjust Point to Site DNS

Now that the DNS Resolver has been created with an inbound endpoint, allowing the lookup of private endpoints, we need to add the Private Resolver as a DNS server in our point-to-site VPN configuration. First, we need the private IP of the newly created inbound endpoint.

  1. Navigate to the DNS Private Resolver in the Azure Portal

  2. Open your DNS Private Resolver service

  3. Click on Inbound Endpoints

  4. Make a note of the private IP of your inbound endpoint.


  6. Now that the private inbound resolver has been configured, we need to add its IP as a DNS server in our Azure VPN configuration so that our DNS queries resolve to private endpoints; to do this, you will need to modify the 'azurevpnconfig.xml' file and reimport the VPN profile.

  7. Right-click 'azurevpnconfig.xml' and edit in Notepad or Visual Studio Code

  8. Under the 'clientconfig' section:

  9. Add a 'dnsservers' entry containing the IP of your Inbound endpoint copied earlier:
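As a sketch, the relevant section of 'azurevpnconfig.xml' typically looks like the following (10.0.1.4 is an example placeholder; replace it with the private IP of your inbound endpoint):

```xml
<clientconfig>
  <dnsservers>
    <dnsserver>10.0.1.4</dnsserver>
  </dnsservers>
</clientconfig>
```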

  10. Save and reimport to the Azure VPN Client

  11. Once connected, ping a resource behind a private endpoint, and you should get the private IP of that resource back and should be able to connect to that resource privately.


Any future or current private endpoints linked to the same Virtual Network will instantly be accessible without additional changes on the Azure VPN client. If you have a Hub & Spoke topology, you may place the DNS Private Resolver in the hub, then use forwarding rules to link to other peered VNets.
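If you want a scriptable check rather than eyeballing ping output, the test is simply whether the answer you get back is a private address. A small Python sketch (the IP addresses are example values):

```python
import ipaddress

def is_private_endpoint_answer(ip: str) -> bool:
    """True if a DNS answer falls in private address space, which is what you
    expect once queries flow through the resolver's inbound endpoint."""
    return ipaddress.ip_address(ip).is_private

print(is_private_endpoint_answer("10.0.2.5"))   # True - a private endpoint IP
print(is_private_endpoint_answer("20.53.0.1"))  # False - a public answer
```

A public answer for a private-link resource suggests the query is still bypassing the resolver.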

Additional resources

The resources below are useful for further reading and learning about the Azure Private DNS Resolver.

  • Quickstart: Create an Azure private DNS Resolver using the Azure portal

  • Intro to Azure DNS Private Resolver

  • Azure DNS Private Resolver - MicroHack

  • My Azure Private DNS Resolver Bicep export for reference:

    param dnsResolvers_PrivateDNSResolver_name string = 'PrivateDNSResolver'
    param virtualNetworks_vnettest_externalid string = '/subscriptions/57627713-eff2-44fa-a546-a2c8fde3c6e3/resourceGroups/pointtositetest/providers/Microsoft.Network/virtualNetworks/vnettest'

    resource dnsResolvers_PrivateDNSResolver_name_resource 'Microsoft.Network/dnsResolvers@2020-04-01-preview' = {
      name: dnsResolvers_PrivateDNSResolver_name
      location: 'australiaeast'
      properties: {
        virtualNetwork: {
          id: virtualNetworks_vnettest_externalid
        }
      }
    }

    resource dnsResolvers_PrivateDNSResolver_name_InboundEndpoint 'Microsoft.Network/dnsResolvers/inboundEndpoints@2020-04-01-preview' = {
      parent: dnsResolvers_PrivateDNSResolver_name_resource
      name: 'InboundEndpoint'
      location: 'australiaeast'
      properties: {
        ipConfigurations: [
          {
            // Replace 'snet-dnsresolver' with the name of your resolver subnet.
            subnet: { id: '${virtualNetworks_vnettest_externalid}/subnets/snet-dnsresolver' }
          }
        ]
      }
    }

IMS Payroll not opening as a published application in Azure Virtual Desktop

· 3 min read

Azure Virtual Desktop allows you to access an entire desktop or a published application, with shortcuts and an appearance as though it was running locally. Depending on the requirements, I prefer published applications where possible, to keep the user experience on the endpoint device and keep the cost down.

One of the applications I published for a customer is MYOB IMS Payroll.

IMS Payroll worked well as a published application for months until one day, it didn't seem to open for the user, whether as a published application or in the Full Desktop.

The symptoms were that once the user clicked on the icon, it would appear to open (visible on the Taskbar), but there was no window, and when you hovered over the preview thumbnail, it was blank. The cursor also appeared to be active with a circle, indicating it was trying to open.

Even if you don't have IMS Payroll, you may experience applications with a similar experience, and hopefully, this article will help point you in the right direction.

Azure Virtual Desktop - Published Application

One noticeable difference we found in our testing was that it opened fine for us and for other users using different accounts.

After some investigation, we discovered that the user had gone to another branch office and used a different monitor setup, and IMS Payroll was out of drawing range. Usually, Windows would be able to snap this back into view; however, after comparing the registry keys for our user vs the user who had the issue, we found that IMS Payroll stores the window location in the user registry.

  • Registry Key location: HKEY_CURRENT_USER\IMS Payroll Partner\Layout

In our case, the settings were as follows (only the relevant Left entry is shown; the other Layout values are omitted):

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\IMS Payroll Partner\Layout]
"Left"="684"

For the users who couldn't see IMS Payroll, their settings looked more like this:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\IMS Payroll Partner\Layout]
"Left"="-1444"

The difference was that the Left entry had moved the window too far left, out of view, so the user could not see it when opening IMS Payroll as a published app or on a Desktop.

After the Left entry was changed from -1444 to 684, IMS Payroll became visible again, both as a published application and on the Full Desktop.

Due to the hard-coded user registry entries, this specific issue would have occurred regardless of Azure Virtual Desktop, running in a Terminal Services environment, or even locally, when working with different monitor setups.

Note: Some applications may have configuration files stored in the user's AppData folders instead of the registry; if in doubt, raise a support ticket with the application vendor.

Microsoft Azure - Operational Cost Optimization Tasks

· 16 min read

The Microsoft Azure platform is not a set-and-forget ecosystem; like your car, it needs regular servicing!

There is no one-size-fits-all when it comes to cost optimization, but some general tasks can be done or considered on a monthly, quarterly, or annual basis to keep on top of the resources you are running in Azure and to keep them lean.


Although Microsoft takes a lot of traditional infrastructure management and security concerns off your hands, you are still responsible for the spending and for ensuring that the value of the technologies and services you consume matches your business goals and agility.

Today we are going to go back to basics and look at the Cost Optimization pillar of the Microsoft Well-Architected Framework.

“The cost optimization pillar provides principles for balancing business goals with budget justification to create a cost-effective workload while avoiding capital-intensive solutions. Cost optimization is about looking at ways to reduce unnecessary expenses and improve operational efficiencies.”

“Use the pay-as-you-go strategy for your architecture, and invest in scaling out, rather than delivering a large investment-first version. Consider opportunity costs in your architecture and the balance between first-mover advantage versus fast follow.”

The right governance and oversight can help prevent Cloud sprawl and wasted consumption costs.

To help get you started, I have put together a list of optimization opportunities that should be run regularly: items such as reviewing unassociated public IPs should be done monthly (along with Azure Advisor checks), and Azure Reservation reviews at least quarterly.

This is not an exhaustive list, and the use of Azure Policy and Azure Advisor helps supplement these tasks.

If you have other tasks that you run, feel free to share them with the community in the page comments below.

Microsoft Graph and KQL queries can also be used in conjunction with PowerShell to pull recommendations straight out of Advisor, which can then be fed into reports; and the value of community tools such as the Azure Optimization Engine should not be underestimated.


Azure - Monitor & optimize

Keep within the cost constraints

Every design choice has cost implications. Before choosing an architectural pattern, Azure service, or a price model for the service, consider the budget constraints set by the company. As part of the design, identify acceptable boundaries on scale, redundancy, and performance against cost. After estimating the initial cost, set budgets and alerts at different scopes to measure the cost.

One of the cost drivers can be unrestricted resources. These resources typically need to scale and consume more cost to meet demand.

Aim for scalable costs

A key benefit of the cloud is the ability to scale dynamically. The workload cost should scale linearly with demand.

You can save costs through automatic scaling. First, consider the usage metrics and performance to determine the number of instances. Then, choose smaller instances for a highly variable workload and scale out rather than up to get the required level of performance. This choice will enable you to make your cost calculations and estimates granular.
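As a rough sketch of the scale-out calculation (the request rates are made-up example numbers, not Azure metrics):

```python
import math

def instances_needed(demand_rps: float, per_instance_rps: float) -> int:
    """Scale out: how many small instances are needed to meet current demand."""
    return max(1, math.ceil(demand_rps / per_instance_rps))

print(instances_needed(450, 100))  # 5
print(instances_needed(40, 100))   # 1 - scale back in to the minimum
```

Because capacity is added in small instance-sized steps, cost tracks demand far more closely than a single large, permanently provisioned instance would.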

Pay for the consumption

Adopt a leasing model instead of owning infrastructure. Azure offers many SaaS and PaaS resources that simplify the overall architecture. The cost of hardware, software, development, operations, security, and data centre space is included in the pricing model. Also, choose pay-as-you-go over fixed pricing. That way, you're charged for only what you use as a consumer.

Right resources, the right size

Choose the right resources: ones that are aligned with business goals and can handle the workload's performance requirements.

An inappropriate or misconfigured service can impact the cost.

For example, building a multi-region service when the service levels don't require high availability or geo-redundancy will increase cost without any reasonable business justification. Specific infrastructure resources are delivered as fixed-size building blocks. Ensure that these blocks are adequately sized to meet capacity demand and deliver expected outcomes.

Monitor and optimize

Treat cost monitoring and optimization as a process rather than a point-in-time activity. Conduct regular cost reviews and measure and forecast the capacity needs so that you can provision resources dynamically and scale with demand. Review the cost management recommendations and take action.

Today, we will focus on Monitor and optimize.

Review Underutilized Resources

Optimize and improve efficiency by identifying idle and underutilized resources across the Azure ecosystem.

Review Azure App Service Plans

Review Azure App Service Plans to determine if:

  1. The Azure App Service Plan is ‘Standard’ or ‘Premium’ pricing and has an associated application.
  2. If the Azure App Service Plan is being utilized (by looking at the Metrics/CPU) or whether it could be downscaled to a smaller plan.

Review shutdown workloads

Because you pay for Azure Resources as 'Pay As You Go', a quick win can be to review Virtual Machines to determine if the workload actually needs to run 24/7!

For example, you have automation configured to automatically start up and shut down workloads based on the following schedule: 7 AM Start – 7 PM Stop (& off Weekends).

You can add servers to this automated schedule by adding a tag to the Virtual Machine, or trigger automation when a workload is 'Shutdown' but not deallocated; see my article on "Turn on an Azure Virtual Machine using Azure Automation" for a potential place to start.
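The schedule above is simple to express in code. A Python sketch of the 7 AM - 7 PM weekday window (the function name and times are illustrative, not part of any Azure tooling):

```python
from datetime import datetime

def should_be_running(now: datetime) -> bool:
    """Sketch of the 7 AM Start - 7 PM Stop schedule, off on weekends."""
    if now.weekday() >= 5:      # 5 = Saturday, 6 = Sunday
        return False
    return 7 <= now.hour < 19   # running between 07:00 and 18:59

print(should_be_running(datetime(2022, 9, 2, 9, 0)))   # Friday 9 AM -> True
print(should_be_running(datetime(2022, 9, 3, 9, 0)))   # Saturday -> False
print(should_be_running(datetime(2022, 9, 2, 20, 0)))  # Friday 8 PM -> False
```

An Azure Automation runbook evaluating logic like this against each tagged VM is one way to implement the schedule.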

Review Azure Advisor

The Azure Advisor is an inbuilt tool critical to optimizing the Azure Environment. The Azure Advisor needs to be reviewed for Cost recommendations.

  1. The Azure Advisor will recommend Reserved Instances.
  2. The Azure Advisor will recommend if a Virtual Machine is running on a VM size GREATER than what it needs (based on CPU utilization under 5% in the last 14 days). If the Azure Advisor reports an overprovisioned machine, investigate its use and resize it to a more suitable size.

Review Azure SQL Databases

Review Azure SQL Databases to determine if:

  1. The SQL Database Pricing Tier is 'Standard' and whether it is using its DTUs (usually found by looking at the Compute utilization on the database); if not, downsize the DTU limit.
  2. Check Geo-Replication to ensure that the SQL Database is not replicating across Regions if it doesn’t need to be.

Review Azure Reserved Instances

Azure reserved instances significantly reduce costs (up to 72 per cent compared to pay-as-you-go prices) with one-year or three-year terms on Windows and Linux virtual machines (VMs). What's more, you can improve budgeting and forecasting with a single upfront payment (i.e. pay for a VM upfront for the one- or three-year term), making it easy to calculate your investments. Or lower your upfront cash outflow with monthly payment options at no additional cost.

Azure Reserved Instance
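The percentage saved by a reservation is straightforward arithmetic. A Python sketch with illustrative hourly rates (not real Azure prices):

```python
def reservation_savings_percent(payg_hourly: float, reserved_hourly: float) -> float:
    """Percentage saved versus pay-as-you-go for a given reserved rate."""
    return (1 - reserved_hourly / payg_hourly) * 100

# Example: $0.10/hr pay-as-you-go vs $0.04/hr effective reserved rate.
print(round(reservation_savings_percent(0.10, 0.04), 1))  # 60.0
```

Running the real numbers per workload helps when weighing a reservation against the risk that the workload is resized or retired mid-term.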

The Azure Advisor is an inbuilt tool critical to optimizing the Azure Environment. The Azure Advisor needs to be reviewed for Reserved Instance recommendations.

When reviewing Reserved Instances, you need to take into consideration:

  1. What workloads are they used for?
  2. Is there a project that may replace or resize the workloads next year?
  3. Who is paying for the workloads?

Review unused files and VHDs

Save Azure costs by cleaning up unused VHDs in your Azure storage. Azure stores Azure Virtual Machine OS and data disks in Azure storage accounts.

When a VM is deleted from the Azure portal, the underlying OS and data disks may not get deleted. Such disks continue to consume Azure storage and account for the cost of storing them. These disks are called Orphaned Disks.

As mentioned above, some Virtual Machines with unmanaged disks will keep the VHDs around when deleted.

Using a PowerShell script (provided by Microsoft), you can report on any disks that are not in use by a VM and then delete them.

Note: Be VERY cautious doing this; solutions such as Citrix and Azure Image Builder use unmanaged disks to create new Session hosts, etc., so context is key.

For Azure Storage accounts holding blob data, such as diagnostic accounts, it is a good idea to implement Azure Blob Storage lifecycle management on the storage accounts, so you are only retaining recent and relevant data.

Azure Blob Storage Lifecycle Management

The lifecycle management policy lets you:

  1. Transition blobs to a cooler storage tier (hot to cool, hot to archive, or cool to archive) to optimize for performance and cost
  2. Delete blobs at the end of their lifecycles
  3. Define rules to be run once per day at the storage account level.
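As a sketch, a lifecycle management policy covering points 1 and 2 looks like the following JSON (the rule name and day counts are example values; adjust them to your retention needs):

```json
{
  "rules": [
    {
      "name": "tier-and-expire-diagnostics",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": [ "blockBlob" ] },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "delete": { "daysAfterModificationGreaterThan": 90 }
          }
        }
      }
    }
  ]
}
```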

Review budgets

Budgets in Cost Management help you plan for and drive organizational accountability. With budgets, you can account for the Azure services you consume or subscribe to during a specific period.

Budgets help you inform others about their spending to proactively manage costs and monitor how spending progresses over time.

When the budget thresholds you've created are exceeded, notifications are triggered. None of your resources are affected, and your consumption isn't stopped; however, you can use budget alerts as a trigger to run Azure Logic Apps or Functions to automatically shut down or resize resources. You can also use budgets to compare and track spending as you analyze costs.

Azure Budget

Ensure Azure Budget notifications are configured to email Product Owners or other Stakeholders once a Resource Group or Subscription reaches a specific threshold.

This is set up in the Azure Portal, on the Resource Group under Budgets, and set to email the Application Owner.

As an example, I generally recommend configuring three budget notification thresholds, to give enough notice:

  • 50%
  • 60%
  • 70%
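A quick Python sketch of how those thresholds translate into alerts (the spend and budget figures are example values):

```python
def thresholds_crossed(spend: float, budget: float, thresholds=(50, 60, 70)):
    """Which alert thresholds (as % of budget) the current spend has crossed."""
    percent_used = spend / budget * 100
    return [t for t in thresholds if percent_used >= t]

# $650 spent against a $1,000 budget is 65% used: the 50% and 60% alerts fire.
print(thresholds_crossed(650, 1000))  # [50, 60]
```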

Review Tags

You apply tags to your Azure resources, resource groups, and subscriptions to logically organize them into a taxonomy. Each tag consists of a name and a value pair. For example, you can apply the name "Environment" and the value "Production" to all the resources in production.

Tags can be used to determine things like:

  • Who to bill?
  • Who supports it?

Azure Portal - Tags

The right tags can mean that the right owners get charged internally and have more ownership of their resource costs. Examples below:

| Tag Name | Value | Comment |
| --- | --- | --- |
| Dept | Finance | Name of the department that owns the resources. |
| Environment | UAT | What environment the resource is used for, such as Production, UAT and Development. |
| Application Owner | Luke Murray | The name of the Product Owner for the service sitting inside the Resource Group. |
| Support Team | Platform Team | What team is responsible for the resources/site for support reasons. |
| Billing Code | Operational | Purchase order or project billing code. |

For further examples and a base tagging convention, check out a blog article I wrote on Microsoft Azure Tagging conventions.
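If you script tag audits, the check is just a set difference. A Python sketch using the tag names from the table above:

```python
REQUIRED_TAGS = {"Dept", "Environment", "Application Owner", "Support Team", "Billing Code"}

def missing_tags(resource_tags: dict) -> list:
    """Which tags from the convention table a resource is still missing."""
    return sorted(REQUIRED_TAGS - resource_tags.keys())

print(missing_tags({"Dept": "Finance", "Environment": "UAT"}))
```

The same check can be enforced at deployment time with an Azure Policy that requires these tags.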

Review Hub (Hybrid Use Benefit)

The Azure Hybrid Benefit is a pricing benefit for customers with Software Assurance licenses, which helps maximize the value of existing on-premises Windows Server and/or SQL Server license investments when migrating to Azure.

Eligible customers can save up to 40% on Azure Virtual Machines (infrastructure as a service, or IaaS), and save up to 55% on Azure SQL Database (platform as a service, or PaaS) and SQL Server on Azure Virtual Machines (IaaS) with Azure Hybrid Benefit, which increases to up to 80% when combined with Azure Reserved Instances.

Azure - Hybrid Use Benefit

To verify whether a server is using the Azure Hybrid Benefit, log in to the Azure Portal and navigate to the Virtual Machines blade. Make sure that the OS Licensing Benefit column is selected.

If a Virtual Machine already has HUB, 'Azure hybrid benefit' will be listed in the column, and any non-supported workloads (such as Linux) will show 'Not Supported'.

If any are eligible for HUB, click on the Virtual Machine…

  1. Click the Configuration blade
  2. Under Licensing, find 'Already have a Windows Server license?'
  3. Select Yes and click Save

Note: This is a non-intrusive change that will take effect on the billing immediately and doesn’t cause any impact on the Virtual Machine.

Review Backups

Azure Backup is simple because it’s built into the platform. It has one-click backup support for SQL databases and virtual machines running in Azure.

Azure Backup is cost-effective and less complex than other cloud backup solutions while keeping your data safe from ransomware and human errors. Sometimes there will be workloads backed up to migrate, test, or clone, and you no longer need to retain the data.

Note: This can be a tricky one, as you will need to talk to product owners to confirm that the workloads were just Dev/Test workloads and are not required; there may be legal implications around keeping (or deleting) workloads in the backup. But if someone stood something up to play with, particularly in a Sandbox or Development subscription, there may not be a reason to keep it around.

Log in to the Azure Portal and navigate to the Recovery Services Vault page. Then, for each vault:


  1. Under Usage, click on Backup Items

  2. Click on Azure Virtual Machines

  3. Sort the Backup items by Latest Restore Point (so the older restore points are at the top)

    Using the Latest Restore Point as a guide, if any servers can have their Backups deleted:

  4. Click on the Name of the Backup Item

  5. Click on Stop Backup

  6. Select Delete Backup Data (this is non-reversible)

  7. Type in the name of the Backup Item and select Stop Backup

Review unused Public IPs

Public IP addresses allow Internet resources to communicate inbound to Azure resources. Public IP addresses enable Azure resources to communicate to the Internet and public-facing Azure services.

This is also a great opportunity to inspect what public IP addresses you have and make sure no resources have public IPs assigned that do not need to be! Tip: set up an Azure Policy that prevents the creation of public IPs.

The address is dedicated to the resource until it’s unassigned by you. A resource without a public IP assigned can communicate outbound. Azure dynamically assigns an available IP address that isn’t dedicated to the resource.

Azure Portal - Public IP Address

When resources get created, sometimes they will create a Public IP; these can be removed as part of the build but left in the Resource Groups.

We want to remove unattached Public IPs to save money.

Note: In some cases, the Product Owner may need to be consulted before any changes are made, as some of the resources may be inflight projects or required.

  1. Log in to the Azure Portal and navigate to the Public IP Addresses blade
  2. Look in the ‘Associated to’ column, and if not required, click on the Public IP
  3. Click Delete

Review Azure Storage Accounts

An Azure storage account contains all your Azure Storage data objects: blobs, files, queues, tables, and disks. Your Azure storage account's data is durable, highly available, secure, and massively scalable.

General-purpose storage accounts may be configured for either of the following performance tiers:

  • A standard performance tier for storing blobs, files, tables, queues, and Azure virtual machine disks.
  • A premium performance tier for storing unmanaged virtual machine disks. If a Storage account is Premium but only needs to be Standard (or LRS instead of ZRS), this can save some money.

Note: In some cases, the Product Owner may need to be consulted before any changes are made, as some of the resources may be inflight projects or required.

  1. Log in to the Azure Portal and navigate to the Storage Account blade
  2. Click on Manage View, Edit Columns, and add in: SKU
  3. Review the Premium Storage Accounts and determine if any accounts need to be downsized to Standard
  4. To change, click on the Storage Account
  5. Click on Configuration and change from Premium to Standard

You can also look at the Replication. Does that Storage Account need to be Geo-Redundant if the rest of the application that uses it isn’t? Can the storage account be changed to Standard during off-hours or non-peak?

- Download the PDF version of these Tasks

Finally, if you prefer this in a more PDF/visual format, you can download a PDF version of this directly from my GitHub.

- Azure Operational Checklist table

This is a very quick example of what an Azure Operational Checklist could look like; if you record what tasks you do, you can look at further automation around implementation and reporting.

Azure Checklist:

  • Review Azure App Service Plans
  • Review shutdown workloads
  • Review Azure Advisor
  • Review Azure SQL Databases
  • Review Azure Reserved Instances
  • Review unused files and VHDs
  • Review budgets
  • Review Tags
  • Review Hub (Hybrid Use Benefit)
  • Review Backups
  • Review unused Public IPs
  • Review Azure Storage Accounts

Connect to Azure SQL Database in a Jupyter Notebook using Python

· 3 min read

Jupyter Notebooks, commonly used by Data Scientists and Students, allow you to run code, such as Python and PowerShell, inside a Notebook format and display the output inside the notebook; this is useful for teaching a subject or displaying up-to-date information.

I am not a Python or Jupyter expert, so this article will briefly cover how I was able to connect to an Azure SQL Database using Microsoft Entra ID authentication and run a query.

To run a Jupyter Notebook, you can install Anaconda and then use that to download Jupyter and run notebooks from a locally (or server) hosted web-based interface.

However, today I will be using Visual Studio Code with the Jupyter extension on a Windows endpoint.

Make sure you install:

  • Visual Studio Code (with the Jupyter extension)
  • Python
  • The pyodbc library
  • The Microsoft ODBC Driver 18 for SQL Server (referenced in the connection string below)

Note: Jupyter Notebook files end in '.ipynb'.

Once all the prerequisites are installed, it's time to create the Notebook.

  1. Open Visual Studio Code

  2. Click File, New File

  3. Select Jupyter Notebook

  4. Press + Code (to add a Code snippet)

  5. First, we need to import the pyodbc library:

    import pyodbc

Then we need to add the snippet to connect to the SQL database (this can be in a separate code block or the same one, as long as the import is run before the SQL connection is made). Make sure you update the server and database variables to match your environment!

#Connection to SQL database

server = ''  # your Azure SQL server name, e.g. 'yourserver.database.windows.net'
database = 'DBNAME'
username = '[email protected]'
password = 'password'  # placeholder; not used with interactive authentication

connection = pyodbc.connect('Driver={ODBC Driver 18 for SQL Server};Server='+server+',1433;Database='+database+';Uid='+username+';Pwd='+password+';Encrypt=yes;TrustServerCertificate=no;Connection Timeout=180;Authentication=ActiveDirectoryInteractive')
cursor = connection.cursor()

The 'Authentication=ActiveDirectoryInteractive' parameter as part of the Connection string will prompt an interactive Microsoft Entra ID prompt to display and ask for credentials to be logged in; this includes MFA support. Using this method, the username and password variables are simply placeholders.

If you want to hardcode credentials into the Notebook (not recommended), you can remove the 'Authentication=ActiveDirectoryInteractive' section and enter the credentials into the username and password field.
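If you must avoid the interactive prompt but don't want credentials in the notebook, reading them from environment variables is a common middle ground. A Python sketch (the variable names are examples, not an Azure convention):

```python
import os

def sql_credentials():
    """Read SQL credentials from environment variables instead of hardcoding
    them in the notebook. The variable names here are examples."""
    return os.environ.get("AZURE_SQL_USER", ""), os.environ.get("AZURE_SQL_PASSWORD", "")

username, password = sql_credentials()
```

This keeps secrets out of the '.ipynb' file itself, which is easy to accidentally commit to source control.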

Now that we have connected to the database, let us run a test query to obtain the SQL version:

#Sample select query
cursor.execute("SELECT @@version;")
row = cursor.fetchone()
while row:
    print(row[0])
    row = cursor.fetchone()

Jupyter python SQL connection

Congratulations, you have successfully connected to an Azure SQL database and ran a query against the database.

If the connection to SQL appears to be stalling, check to make sure the Azure authentication window isn't hidden behind another window.

A GIST has been created with the code as well, in case there are issues copying it from the website.

Create a Public Holidays API using Microsoft Azure

· 15 min read

In a previous blog post, I used a third-party API (Application Programming Interface) to start a Virtual Machine when it wasn't a public holiday. That got me thinking: what would be an option if I wanted an API only accessible on an internal network, or if I wanted to include custom holidays such as Star Wars Day or company holidays? Could I create and query my own API using Microsoft Azure services? You can!


Today we will create a base Public Holidays API using several Microsoft Azure serverless services: Azure Functions, an Azure Storage Account and API Management.

Note: As this is a demonstration, I will be using a Consumption-based Azure Function and Azure Storage account. Although that is a good place to start, depending on your requirements, you may be better off with the Azure Functions Premium plan to avoid cold-start times; and if you need a high number of requests and writes (GETs and POSTs) and resiliency, then replace the Storage account table with a Cosmos DB.

The solution will be made up of the following:

| Azure Service | Name | Plan | Note |
| --- | --- | --- | --- |
| Application Insights | ai-nzPublicHolidays-prd-ae | | |
| Azure API Management | apims-publicholidays-prd-ae | Developer (No SLA) | |
| Azure Function | func-nzpublicHolidays-prd-ae | Function App - Consumption | |
| Azure Storage Account | funcnzpublicholidaystgac | StorageV2 (general purpose v2) - Locally-redundant storage (LRS) | Contains 'PublicHolidays' table |
| Azure Storage Account | rgnzpublicholidayspb4ed | Storage (general purpose v1) - Locally-redundant storage (LRS) | Contains Azure Functions App Files |
| Resource Group | rg-publicholidays-prd-ae | Resource Group - containing above resources | |

Azure Resource Group - Diagram


Note: AzTables is not part of the standard Az PowerShell module set and is a separate module you will need to install (Install-Module AzTables).

We will use a mix of the Azure Portal and PowerShell to deploy this solution from start to finish; you can find the source data and code directly in the GitHub repository here: lukemurraynz/PublicHoliday-API for reference (feel free to fork, raise pull requests etc.). In this guide, I will try not to assume preexisting knowledge (other than general Azure and PowerShell knowledge).


The deployment steps will be separated into different sections to help simplify implementation.

First, make sure you adjust the names of your resources and locations to suit your naming conventions and regional locations (such as Australia East or West Europe). Your deployments may fail if a name is already in use. See "Microsoft Azure Naming Conventions" for more about Naming conventions.

Create Resource Group

The Resource Group will contain all resources related to the API that we will deploy today.

However, I recommend you consider what resources might be shared outside of this API - such as API Management - and put those in a separate Shared or Common Resource Group, to keep the lifecycle of your resources together (i.e. API resources all in one place, so if the API gets decommissioned, it is as easy as deleting the Resource Group).

  1. Log in to the Microsoft Azure Portal
  2. Click on the burger menu and click Resource groups
  3. Click + Create
  4. Select your Subscription
  5. Type in a name for your Resource Group (like 'rg-publicholidays-prd-ae')
  6. Select your Region and click Next: Tags
  7. Enter in applicable tags (i.e. Application: Public Holidays API)
  8. Click Next: Review + create
  9. Click Create

Create a resource group

If you prefer PowerShell, you can deploy a new Resource Group with the below:

New-AzResourceGroup -Name 'rg-publicholidays-prd-ae' -Location 'Australia East' -Tag @{Application="Public Holidays API"}

Create Storage Account

Now that the Resource Group has been created, it's time to create our Storage Account, which will hold our Table of Public Holiday data.

  1. Log in to the Microsoft Azure Portal
  2. Click on the burger menu and click Storage Accounts
  3. Click + Create
  4. Select the Subscription and Resource Group you created earlier
  5. Enter in a Name for your Storage Account (like 'funcnzpublicholidaystgac')
  6. Select your Region (i.e. Australia East)
  7. For Performance, I am going to select: Standard
  8. For Redundancy, as this is a demo, I will select Locally-redundant storage (LRS). However, if you plan on running this in production, you may consider ZRS for zone redundancy.
  9. If you plan on locking down the Storage Account to your Virtual Network or specific IP addresses, continue to the Networking tab; otherwise, we can accept the defaults and click: Review.
  10. Click Create

If you prefer PowerShell, you can deploy a new Storage account with the below:

New-AzStorageAccount -ResourceGroupName 'rg-publicholidays-prd-ae' -Name 'funcnzpublicholidaystgac' -Location 'Australia East' -SkuName 'Standard_LRS' -Kind StorageV2

Import Public Holiday data

Create Azure Storage Account Table

Now that we have the Storage account that will hold our Public Holiday data, it's time to import the data.

Most of this task will be done with PowerShell, but first, we need to create the Table that will hold our Public Holidays.

  1. Log in to the Microsoft Azure Portal
  2. Click on the burger menu and click Storage Accounts
  3. Navigate to your created Storage account
  4. In the Navigation blade, click Tables
  5. Click + Table
  6. For a Table Name, I will go with PublicHolidays
  7. Click Ok

Create Azure Storage Account Table

You can also create the Table with PowerShell:

$storageAccount = Get-AzStorageAccount -ResourceGroupName 'rg-publicholidays-prd-ae' -Name 'funcnzpublicholidaystgac'
$storageContext = $storageAccount.Context
New-AzStorageTable -Name 'PublicHolidays' -Context $storageContext

Import Public Holiday Data into Table

Now that we have the Azure storage account and PublicHolidays table, it's time to import the data.

If you want to do this manually, the Azure Table will have the following columns:

| Date | Country | Type | Name | Day | Year | Comments |
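For reference, a manually entered row matching those columns might look like the following (all values here are purely illustrative):

```python
# Hypothetical example of a single PublicHolidays table row,
# using the column layout listed above (values are illustrative).
row = {
    "Date": "25-12-2024",   # dd-MM-yyyy, the NZ format used by the import script
    "Country": "NZ",
    "Type": "Public",
    "Name": "Christmas Day",
    "Day": "Wednesday",
    "Year": 2024,
    "Comments": "",
}
```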

We could enter the data manually, but I will leverage the Nager API to download and parse a CSV file for a few countries. You can find the source data and code directly in the GitHub repository here: lukemurraynz/PublicHoliday-API for reference.

To do this, we will need PowerShell, so assuming you have logged into PowerShell and set the context to your Azure subscription, let us continue.

I have created a CSV (Comma-separated values) file with a list of countries (i.e. US, NZ, AU) called 'SourceTimeDate.CSV'; you can adjust this to suit your requirements. Place it in a folder on your C:\ drive called: Temp\API.
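The exact contents of SourceTimeDate.csv aren't reproduced here; based on how the download loop consumes it ($Country.Country), a minimal guess at its layout would be a single Country column:

```python
import csv
import io

# A guess at the layout of SourceTimeDate.csv: a single 'Country'
# header column with one ISO country code per row, matching how the
# download loop reads $Country.Country.
csv_text = "Country\nNZ\nAU\nUS\n"

rows = list(csv.DictReader(io.StringIO(csv_text)))
codes = [r["Country"] for r in rows]
```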

Open PowerShell and run the following:

$Folder = 'C:\Temp\API\'
$Csv = Import-Csv "$Folder\DateTimeSource\SourceTimeDate.csv"
$CurrentYear = (Get-Date).Year

ForEach ($Country in $Csv) {
    $CountryCode = $Country.Country
    Invoke-WebRequest -Uri "$CountryCode/$CurrentYear/CSV" -OutFile "$Folder\DateTimeSource\Country$CountryCode$CurrentYear.csv"
}

These cmdlets will download a CSV file into the API folder for each country, containing that country's Public Holidays for the current year; you can adjust the $CurrentYear variable for future years (i.e. 2025).

Once you have all the CSV files for your Public Holidays, and before we import the data into the Azure storage table, now is the time to create a new Custom Holidays CSV. You can easily copy an existing one to create a new CSV containing your company's public holidays or other days missing from the standard list; make sure it matches the correct format and save it into the same folder.

Custom Public Holidays API

Now that you have all your CSV files containing the Public Holidays for your country or countries, it's time to import them into the Azure Table, using a PowerShell session logged in to Azure.

# Imports Public Holiday data into the Azure Storage table
# Requires the AzTable Module (not part of the normal Az cmdlets)
Import-Module AzTable

#Imports data from CSV files into the $GlobalHolidays variable
$Folder = 'C:\Temp\API\'

$GlobalHolidays = Get-ChildItem "$Folder\DateTimeSource\*.csv" | ForEach-Object {
    $basename = $_.BaseName
    Import-Csv $_
}

#Connects to Azure Storage Account
$storageAccountName = 'funcnzpublicholidaystgac'
$resourceGroupName = 'rg-publicHolidays-prd-ae'
$tableName = 'PublicHolidays'
$storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName
$storageContext = $storageAccount.Context
$cloudTable = (Get-AzStorageTable -Name $tableName -Context $storageContext).CloudTable

#Imports CSV data into the Azure Table
$counter = 0
ForEach ($Holiday in $GlobalHolidays) {

    $Date = [DateTime]($Holiday.Date)
    $Dayofweek = $Date.DayOfWeek | Out-String
    $Year = $Date.Year
    $HolidayDate = Get-Date $Date -Format "dd-MM-yyyy"

    Add-AzTableRow `
        -table $cloudTable `
        -partitionKey '1' `
        -rowKey ((++$counter)) -property @{"Date" = $HolidayDate; "Country" = $Holiday.CountryCode; "Type" = $Holiday.Type; "Name" = $Holiday.LocalName; "Day" = $Dayofweek; "Year" = $Year; "Comments" = $Holiday.Counties }
}

#Validate the data in the Storage table
Get-AzTableRow -table $cloudTable

Import CSV to Azure Storage Account

Validate Azure Storage Account Table

The Public Holidays are now imported into the Azure storage account table with additional information, such as the day each one falls on, and the Date format has been changed to suit the NZ format (DD-MM-YYYY).

If we log in to the Azure Portal and navigate to the Storage account, then under Storage Browser, we can see that our Table is now full of Public Holidays.

Create API

Now that we have our Table with Public Holiday data, it's time to create our Azure Function to act as the API that will talk to the Azure storage account!

Create Azure Function

  1. Log in to the Microsoft Azure Portal
  2. Click on the burger menu and click Resource groups
  3. Navigate to your resource group and click + Create
  4. Search for: Function
  5. Select Function App, and click Create
  6. Enter your Function App Name (i.e. 'func-nzpublicHolidays-prd-ae')
  7. For Runtime Stack, select PowerShell Core
  8. Select the latest version (at this time, it's 7.2)
  9. Select your Region
  10. Select Windows
  11. Set your Plan (in my example, it's Consumption (Serverless))
  12. Click Review + Create
  13. Click Create

Azure Function - Create

Configure Environment Variables

Now that the Function App has been created, we need to add a few environment variables before creating the GetPublicHoliday function; these variables will contain the Resource Group and Storage account names.

  1. Navigate to your Azure Function
  2. Click Configuration
  3. Click + New application setting
  4. Under the name, add: PublicHolidayRESOURCEGROUPNAME
  5. For value, type in the name of your resource group.
  6. Add a second application setting named: PublicHolidaySTORAGEACCNAME
  7. For value, type in the name of your storage account that contains the Public Holiday table.
  8. Click Save (to save the variables).

Azure Function - Variables
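If you prefer PowerShell, the same application settings can be added with Update-AzFunctionAppSetting; this is a sketch, assuming the Function App and Resource Group names used earlier in this article:

```powershell
# Adds the two app settings the Function reads at runtime.
# Assumes the Function App and Resource Group names used earlier.
Update-AzFunctionAppSetting -Name 'func-nzpublicHolidays-prd-ae' -ResourceGroupName 'rg-publicholidays-prd-ae' -AppSetting @{
    'PublicHolidayRESOURCEGROUPNAME' = 'rg-publicholidays-prd-ae'
    'PublicHolidaySTORAGEACCNAME'    = 'funcnzpublicholidaystgac'
}
```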

Configure Managed Identity

Next, we need to give the Function App the ability to read the Azure storage account. To do this, we need to configure a System assigned managed identity.

  1. Navigate to your Azure Function
  2. Click Identity
  3. Under the System assigned heading, toggle the status to On
  4. Click Save
  5. Select Yes, to enable the System assigned managed identity
  6. Under Permissions, click Azure role assignments
  7. Click + Add role assignment
  8. For Scope, select Storage
  9. Select your Subscription and storage account containing your Public Holiday data
  10. For role, select Contributor (Storage Table Data Reader is not enough).
  11. Click Save
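If you prefer PowerShell, the identity and role assignment steps above can be sketched as below (assuming the resource names used earlier, and that the Az.Functions and Az.Resources modules are installed):

```powershell
# Enable the system-assigned managed identity on the Function App.
$func = Update-AzFunctionApp -ResourceGroupName 'rg-publicholidays-prd-ae' -Name 'func-nzpublicHolidays-prd-ae' -IdentityType SystemAssigned -Force

# Grant the identity Contributor rights on the storage account
# (Storage Table Data Reader is not enough, as noted above).
$storage = Get-AzStorageAccount -ResourceGroupName 'rg-publicholidays-prd-ae' -Name 'funcnzpublicholidaystgac'
New-AzRoleAssignment -ObjectId $func.IdentityPrincipalId -RoleDefinitionName 'Contributor' -Scope $storage.Id
```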

Configure Requirements

The Azure function app will rely on a few PowerShell Modules; for the FunctionApp to load them, we need to add them to the requirements.psd1 file.

  1. Navigate to your Azure Function

  2. Click App files

  3. Change the dropdown to requirements.psd1

  4. In the hashtable, comment out the 'Az' module line (as this would load the entire Az module set, causing an increased startup delay when those extra modules aren't needed), and add the following:

    # This file enables modules to be automatically managed by the Functions service.
    # To use the Az module in your function app, please uncomment the line below.
    #'Az' = '8.*'
    'Az.Accounts' = '2.*'
    'Az.Storage' = '4.*'
    'Az.Resources' = '2.*'
    'AzTable' = '2.*'

  5. Click Save

Create Function PublicHolidays

Now that the Function App has been configured, it is time to create our Function.

  1. Navigate to your Azure Function
  2. Click Functions
  3. Click + Create
  4. Change Development environment to Develop in Portal
  5. Select Template, an HTTP trigger
  6. For the New Function name, I will go with GetPublicHoliday
  7. Change Authorization level to Anonymous (if you aren't going to implement API Management, select Function and look at whitelisting only your IP; we will be locking the Function down to API Management later).
  8. Click Create

Create Azure Function App

  1. Click Code + Test

  2. Copy the following Code into the run.ps1 file. This code is the core of the Function: it reads the HTTP request and returns a PowerShell object with the Public Holiday information as part of a GET request:

    <# The code below does the following:
    1. Reads the query parameters from the request.
    2. Writes to the Azure Functions log stream.
    3. Queries the PublicHolidays table for a row matching the CountryCode and Date.
    4. Associates values to output bindings by calling 'Push-OutputBinding'. #>

    using namespace System.Net

    # Input bindings are passed in via param block.
    param([Parameter(Mandatory = $true)]$Request, [Parameter(Mandatory = $true)]$TriggerMetadata)

    # Write to the Azure Functions log stream.
    Write-Host 'GetPublicHoliday function processed a request.'

    # Interact with query parameters of the request.
    $date = $Request.Query.Date
    $country = $Request.Query.CountryCode

    $resourceGroupName = $env:PublicHolidayRESOURCEGROUPNAME
    $storageAccountName = $env:PublicHolidaySTORAGEACCNAME
    $tableName = 'PublicHolidays'

    # The x-forwarded-for header is 'IP:port'; keep the IP only.
    $ClientIP = $Request.Headers."x-forwarded-for".Split(":")[0]

    try {
        Import-Module AzTable

        $storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName
        $storageContext = $storageAccount.Context
        $cloudTable = (Get-AzStorageTable -Name $tableName -Context $storageContext).CloudTable

        # Build a combined filter: Country eq $country AND Date eq $date
        [string]$Filter1 = [Microsoft.Azure.Cosmos.Table.TableQuery]::GenerateFilterCondition("Country", [Microsoft.Azure.Cosmos.Table.QueryComparisons]::Equal, $country)
        [string]$Filter2 = [Microsoft.Azure.Cosmos.Table.TableQuery]::GenerateFilterCondition("Date", [Microsoft.Azure.Cosmos.Table.QueryComparisons]::Equal, $date)
        [string]$finalFilter = [Microsoft.Azure.Cosmos.Table.TableQuery]::CombineFilters($Filter1, "and", $Filter2)
        $object = Get-AzTableRow -table $cloudTable -CustomFilter $finalFilter

        $body = @()

        $System = New-Object -TypeName PSObject
        Add-Member -InputObject $System -MemberType NoteProperty -Name CountryCode -Value $object.Country
        Add-Member -InputObject $System -MemberType NoteProperty -Name HolidayDate -Value $object.Date
        Add-Member -InputObject $System -MemberType NoteProperty -Name HolidayYear -Value $object.Year
        Add-Member -InputObject $System -MemberType NoteProperty -Name HolidayName -Value $object.Name
        Add-Member -InputObject $System -MemberType NoteProperty -Name HolidayType -Value $object.Type
        Add-Member -InputObject $System -MemberType NoteProperty -Name Comments -Value $object.Comments
        Add-Member -InputObject $System -MemberType NoteProperty -Name RequestedIP -Value $ClientIP

        $body += $System

        $status = [Net.HttpStatusCode]::OK
    }
    catch {
        $body = "Failure connecting to table for state data, $_"
        $status = [Net.HttpStatusCode]::BadRequest
    }

    # Associate values to output bindings by calling 'Push-OutputBinding'
    Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
            StatusCode = $status
            Body       = $body
        })

  3. Click Save

Test Function - PublicHolidays

Before proceeding with the next step, it's time to test the function.

  1. Navigate to your Azure Function
  2. Click Functions
  3. Click GetPublicHoliday
  4. Click Code + Test
  5. Click Test/Run
  6. Change the HTTP method to GET
  7. Under Query, add a CountryCode value and a Date value.

Note: Make sure the date and country formats match what is in the Azure storage account.

You can also invoke the Function App directly with PowerShell, passing the Date and CountryCode as parameters at the end:

Invoke-RestMethod -URI ""

Test Public Holiday API

Congratulations! You have now created a Public Holiday API that you can call for automation! You can lock down the Function App to only certain IPs or proceed to configure Azure API Management.

Configure Azure API Management

Now that the Function App responds to requests, we can expose the HTTP endpoint through Azure API Management. Azure API Management will give greater flexibility and security over API endpoints, particularly when dealing with more than one API. Azure API Management also offers inbuilt shared cache functionality and integration into Azure Cache for Redis.

  1. Log in to the Microsoft Azure Portal
  2. Navigate to your Azure Function
  3. On the Navigation blade, select API Management
  4. Click Create New
  5. Select your subscription, Region, and organisation name.
  6. Select a Pricing Tier
  7. Click Review + Create
  8. Click Create

Create Azure API Management

  1. Wait 10 to 30 minutes for provisioning to take place; Azure API Management will be in an activating state during this time.

  2. Once API Management has been provisioned, you can copy the Virtual IP (VIP) addresses of API Management and restrict your function app to only allow inbound access from that IP.

  3. Once you have done that, add the GetPublicHoliday Function App into Azure API Management, adjust the paths to add a version, and then, using the subscription key, run the following command to pull data:

    Invoke-RestMethod -uri "$KEY"
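Step 2 above (restricting the Function App so that only the API Management VIP can reach it) can be sketched in PowerShell as follows; the VIP value here is a placeholder, so copy the real one from the API Management Overview blade:

```powershell
# Placeholder VIP; replace with the address shown on the APIM Overview blade.
$apimVip = '20.0.0.1'

# Allow inbound traffic from API Management only (Az.Websites module).
Add-AzWebAppAccessRestrictionRule -ResourceGroupName 'rg-publicholidays-prd-ae' -WebAppName 'func-nzpublicHolidays-prd-ae' -Name 'Allow-APIM' -Priority 100 -Action Allow -IpAddress "$apimVip/32"
```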