
Update your Azure WebApp to be Always On

· 3 min read

By default, Azure Web Apps are unloaded if they are idle for a set period of time (20 minutes). This way, the system can conserve resources.

The downside is that the first request after the web app is unloaded takes longer to respond, as the Web App has to be loaded back into memory before it can serve the request, which can lead to a bad user experience.

To ensure that the Azure App Service Web App is always running and available to respond to incoming HTTP(S) requests, you can set the "Always On" configuration feature to "On".

Overview

Setting the "Always On" feature of the App Service Web App to "On" ensures that Azure keeps an instance of the Web App running at all times. This way, when a user/client hits the Azure Front Door endpoint, the back-end Web App is always ready to respond to the request without timing out, so the application stays available even during periods of low usage or inactivity. Azure will continuously ping the website to keep it alive.

Enabling "Always On" keeps your apps loaded even when there is no traffic. It is required, for example, for continuous WebJobs or for WebJobs that are triggered by a CRON expression. This feature is similar to the Internet Information Services (IIS) idle time-out property.

Disabling "Always On" means your apps are unloaded if they are idle for a set period of time, which lets the system conserve resources. This is why "Always On" is disabled by default.

You need at least the Basic App Service tier to enable "Always On"; it is not available on the Free or Shared tiers.

Whether "Always On" is off or on does not affect your billing or pricing. Azure App Service billing is done at the App Service Plan level, which is charged per hour that the App Service Plan exists and is running; this is charged regardless of whether it hosts a WebApp or whether "Always On" is set.

Configure Always On

Note: Any changes to application settings and connection strings can restart your application, so make sure you schedule this during a period when an intermittent outage is acceptable.

  1. Log in to the Azure Portal
  2. Find your App Service and open it
  3. In the left-hand Blade, under Configuration, select General Settings
  4. Navigate down to Platform Settings and set Always On to 'On'
  5. Click Save
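If you prefer to script this, the same setting can be flipped with the Az PowerShell module. A sketch only; the resource group and app names below are placeholders:

```powershell
# Requires the Az.Websites module and an authenticated session (Connect-AzAccount)
# Enable "Always On" on the site configuration of the Web App
Set-AzWebApp -ResourceGroupName "my-rg" -Name "my-webapp" -AlwaysOn $true
```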

Configure Always On through Azure Policy

The Azure policy below contains a set of three policies which, used in combination, should ensure that your web app is always on. Since the alwaysOn property is optional in an ARM deployment, a combination of three policies is required to ensure true compliance in the environment.
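As an illustrative sketch, one of those policies could audit site configurations where alwaysOn is not enabled. The alias name below is an assumption; verify available aliases with `Get-AzPolicyAlias` before deploying:

```json
{
  "mode": "All",
  "policyRule": {
    "if": {
      "allOf": [
        { "field": "type", "equals": "Microsoft.Web/sites/config" },
        { "field": "Microsoft.Web/sites/config/web.alwaysOn", "notEquals": "true" }
      ]
    },
    "then": { "effect": "audit" }
  }
}
```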

References

For more information about the "Always On" feature please see the documentation below:

Update your Azure WebApp to use your timezone

· 2 min read

By default, Microsoft Azure uses Coordinated Universal Time (UTC) as its standard timezone. As a universal and consistent timezone this makes sense; however, when troubleshooting issues or scheduling jobs, having the time in UTC can add confusion. An Azure WebApp is no exception to UTC as the standard, but this can be changed.

As I am in New Zealand, I will be setting my WebApp (which is hosted in Australia East) from UTC to NZ time.

I will be using a Windows-based App Service for this article.

Find the Timezone

The Azure App Service uses the same naming standard as Windows.

  1. To find the correct name, run the following PowerShell snippet on a Windows PC:

    Get-ChildItem -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zones' | Select-Object PSChildName

Windows Terminal - Timezone

This will list all the compatible timezone names. Because I am in New Zealand, I now know that 'New Zealand Standard Time' is the correct value.

For future reference, I have exported the list of compatible timezones into a CSV file below:
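An export like that can be reproduced with a snippet along these lines (Windows-only, since it reads the local registry; the output path is an arbitrary choice):

```powershell
# List every Windows timezone ID and write the list to a CSV file
Get-ChildItem -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zones' |
    Select-Object PSChildName |
    Export-Csv -Path 'C:\Temp\WindowsTimezones.csv' -NoTypeInformation
```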

Set the Timezone

Changing the Application Settings, which includes setting the timezone, will restart the WebApp's app pool, so make sure this is scheduled at a time when an intermittent outage is acceptable.

  1. Log in to the Azure Portal
  2. Find your Azure WebApp and open it up
  3. On the left-hand side Blade, underneath Settings, click on Configuration
  4. Click on + New Application Setting
  5. Type in the following Key/Value pair: WEBSITE_TIME_ZONE | New Zealand Standard Time
  6. Click Ok
  7. Click Save to confirm and save the change.
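The same Key/Value pair can be added with the Az PowerShell module. Note that `Set-AzWebApp -AppSettings` replaces the whole settings collection, so the existing settings must be copied first. A sketch; resource group and app names are placeholders:

```powershell
# Fetch the app, copy its current settings, add WEBSITE_TIME_ZONE, and write them back
$app = Get-AzWebApp -ResourceGroupName "my-rg" -Name "my-webapp"
$settings = @{}
foreach ($s in $app.SiteConfig.AppSettings) { $settings[$s.Name] = $s.Value }
$settings["WEBSITE_TIME_ZONE"] = "New Zealand Standard Time"
Set-AzWebApp -ResourceGroupName "my-rg" -Name "my-webapp" -AppSettings $settings
```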

Azure WebApp - Timezone

Test the Timezone

  1. Log in to the Azure Portal
  2. Find your Azure WebApp and open it up
  3. On the left-hand side Blade, underneath Development Tools, click on Console
  4. Type the following into the console:
  • Time
  • Date

These commands confirm that the date and time now match your timezone; the Azure WebApp logs etc. will also use it.

Azure App Service - Console

Run PowerShell App Deployment Toolkit in Datto RMM

· 2 min read

The PowerShell App Deployment Toolkit provides a set of functions to perform common application deployment tasks and to interact with the user during deployment. It simplifies the complex scripting challenges of deploying applications in the enterprise, provides a consistent deployment experience and improves installation success rates.

PowerShell App Deployment Toolkit

Although the PowerShell App Deployment Toolkit makes application installation much more visible and gives your users more control over how and when an application is installed, due to some technical limitations you can't run the PowerShell App Deployment Toolkit directly from the Datto RMM package store.

This is a brief article, intended to help other people who may be using the App Deployment Toolkit with Datto RMM.

DattoRMMpowerShellAppDeploymentToolkitCommand.ps1

#This is the name of the zip file in the component. Make sure that the PowerShell App Deployment Toolkit is zipped.
$ZipFile = "DesktopSOE.zip"
#Folder name derived from the zip file name, e.g. "DesktopSOE"
$DestinationFolder = $ZipFile.Split(".")[0]
#This will create the folder: C:\Temp\DesktopSOE\ (these folders can be changed to suit your requirements)
New-Item -Path "C:\Temp\$DestinationFolder" -ItemType Directory -Force
#This will copy the PowerShell App Deployment Toolkit zip to a folder outside of the CentraStage Packagestore location.
Copy-Item -Path ".\$ZipFile" -Destination "C:\Temp\$ZipFile" -Force
#This will extract the PowerShell App Deployment Toolkit and run it.
Expand-Archive -Path "C:\Temp\$ZipFile" -DestinationPath "C:\Temp\$DestinationFolder\" -Force
& "C:\Temp\$DestinationFolder\Deploy-Application.exe"

Note: You may also need to edit AppDeployToolkitConfig.xml and change the <Toolkit_RequireAdmin> element to False, to avoid issues with UAC (User Account Control).
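For reference, the relevant fragment of AppDeployToolkitConfig.xml looks roughly like this (element names are from PSADT v3; check them against your toolkit version):

```xml
<Toolkit_Options>
    <!-- Set to False so Deploy-Application.exe does not demand elevation via UAC -->
    <Toolkit_RequireAdmin>False</Toolkit_RequireAdmin>
</Toolkit_Options>
```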

I also ran my component as:

  • Only Run when the user is logged in
  • Only run if User has Administrator rights

Full end to end encryption on an Azure WebApp using Cloudflare

· 7 min read

Cloudflare offers many capabilities; one of the capabilities it offers is SSL offloading and CNAME flattening.

When you set up an Azure Web App with default settings, it uses HTTP, not HTTPS. We will point the WebApp at your custom domain, then use Cloudflare to encrypt traffic from your users' browsers to Cloudflare, and finally encrypt the traffic from Cloudflare to your website.

We will go through both setups, with the result being full end-to-end encryption of your Azure WebApp using Cloudflare and your custom domain.

Using Cloudflare without a backend Certificate


Using Cloudflare with a backend Certificate


By default, Azure WebApps have a wildcard cert for the following domains:

  • *.azurewebsites.net
  • With Subject alternative names for:
  • *.scm.azurewebsites.net
  • *.azure-mobile.net
  • *.scm.azure-mobile.net
  • *.sso.azurewebsites.net

badasscloud - azurewebsites.net secure

This certificate allows you to use HTTPS with the default azurewebsites.net URL, which is created with your Azure WebApp and is completely managed by Microsoft and the Azure ecosystem. However, if you want to use your own custom domain, these certificates won't work.

Prerequisites

  • Azure WebApp (Custom Domain/SSL support is available from the 'B1' plan and upwards)
  • Cloudflare account (can be free)
  • Domain (an actual custom domain to use for your website that is already set up to use Cloudflare nameservers)
  • PfxCreator

Add a Custom Domain to your Azure WebApp using Cloudflare

  1. Log in to the Azure Portal
  2. Navigate to your App Service.
  3. Underneath Settings on the left-hand side blade of the App Service, look for Custom Domains and select it.
  4. Click on ‘Add Custom Domain’.
  5. Type in your custom domain (in my example, I am using a domain I own called badasscloud.com)
  6. Select Validate; you will see a screen similar to the one below. Select CNAME.
  7. Now we need to validate that you own the domain and can use it for your WebApp, so we will create some records to verify ownership and redirect the website to Azure Websites.
  8. Login to Cloudflare
  9. Select SSL/TLS and make sure that ‘Flexible’ SSL has been selected.
  10. Select DNS. Note: You may need to remove any existing A records for 'www' or the root domain '@'; please keep a reference to them in case you need to roll back any changes. Because we will be redirecting the main URL to an Azure DNS alias, we will use Cloudflare CNAME flattening at the root level, so anyone going to 'badasscloud.com' will be redirected to the Azure WebApp.
  11. You can also use the TXT record to validate the domain and do some reconfiguration ahead of your change, without changing the domain or redirecting traffic, to avoid downtime.
  12. Add the records to Cloudflare (please note that verification will fail if the Cloudflare proxy is turned on, so make sure the proxy status is set to DNS only)
  13. Navigate back to the Azure Portal.
  14. Click on Validate again and select CNAME.
  15. Hostname availability and Domain ownership should both be green.
  16. Click Add Custom Domain.
  17. If they are still red, wait a few minutes for Cloudflare to replicate the changes across its network and for Azure to clear any server-side caching; verification can fail if you try to verify straight away.
  18. Now that domain verification has been completed, navigate to Cloudflare and enable the Cloudflare proxy for your root domain and www record.
  19. Navigate to and test your website. Now that the domain has been added to the Azure WebApp and the Cloudflare proxy has been enabled, your website will have a certificate supplied by Cloudflare. You have now set up Flexible SSL, so traffic between users' browsers and Cloudflare is encrypted.
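Once the Cloudflare records exist, the hostname can also be attached with the Az PowerShell module. A sketch; note that `-HostNames` replaces the full list, so the existing hostnames are appended:

```powershell
# Add the custom domain while preserving the default azurewebsites.net hostname
$app = Get-AzWebApp -ResourceGroupName "my-rg" -Name "my-webapp"
Set-AzWebApp -ResourceGroupName "my-rg" -Name "my-webapp" `
    -HostNames (@("badasscloud.com", "www.badasscloud.com") + $app.HostNames | Select-Object -Unique)
```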

Update your WebApp to support ‘Full’ end-to-end using Cloudflare origin certificate

Adding your domain to Cloudflare was only the first part of the puzzle. Although traffic between the browser and Cloudflare is now encrypted, traffic between Cloudflare and your WebApp is not. To encrypt this traffic, we are going to use a Cloudflare origin certificate.

Cloudflare Origin Certificates are free SSL certificates issued by Cloudflare for installation on your origin server to enable end-to-end encryption for your visitors using HTTPS. Once deployed, they are compatible with the Full (Strict) SSL mode. By default, newly generated certificates are valid for 15 years, but you can choose a shorter validity period, as little as 7 days.

  1. Log in to Cloudflare
  2. Click on SSL/TLS
  3. Click on Origin Server
  4. Click on Create Certificate
  5. Verify that the Private Key Type is RSA (2048)
  6. Make sure that the hostnames you want covered by the origin certificate are included.
  7. Verify the certificate validity; in my example, I am going with 15 years. Remember to keep this certificate valid and updated.
  8. Click Create
  9. Cloudflare will now generate your Origin certificate and Private key (save these somewhere secure; the private key will not be shown again).
  10. Now we need to create a PFX certificate file to upload to the Azure WebApp. Run PfxCreator.exe (see Prerequisites for the download link)
  11. Paste the Origin Certificate into the Certificate (PEM) field
  12. Paste the Private Key into the Private Key (PEM) field
  13. Type in a password for the certificate
  14. Click Save PFX… and save your certificate.
  15. Log in to the Azure Portal
  16. Navigate to your App Service.
  17. Underneath Settings on the left-hand side blade of the App Service, look for Custom Domains and select it.
  18. You should see the SSL state of your domain as 'Not Secure'. Under SSL Binding you will have an option to Add Binding; click on Add Binding.
  19. Select your Custom Domain and click Upload PFX Certificate
  20. Click File and browse for your certificate.
  21. Type in the password you entered in PfxCreator earlier.
  22. Click on Upload.
  23. Once uploaded, select your Custom Domain.
  24. Select the Cloudflare Origin Certificate
  25. Make sure the TLS/SSL type is SNI SSL and click Add Binding.
  26. The SSL State of your Custom Domain should now have been changed to Secure.
  27. Click on HTTPS Only. Note: You may see redirect loops on your website until the following Cloudflare changes have been made.
  28. Login to Cloudflare
  29. Select SSL/TLS and make sure that ‘Full (Strict)’ has been selected.
  30. Give it 30 seconds to a minute to take effect, and you have now successfully encrypted traffic end-to-end on your website, from the browser to Cloudflare and from Cloudflare to your Azure WebApp.
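Steps 15-25 (uploading the PFX and creating the SNI binding) can also be done in a single PowerShell call. A sketch assuming the Az.Websites module; names, paths, and the password are placeholders:

```powershell
# Upload the Cloudflare origin certificate PFX and bind it to the custom domain using SNI
New-AzWebAppSSLBinding -ResourceGroupName "my-rg" -WebAppName "my-webapp" `
    -Name "badasscloud.com" `
    -CertificateFilePath "C:\certs\badasscloud.pfx" `
    -CertificatePassword "<pfx-password>" `
    -SslState "SniEnabled"
```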

#ProTip - If you want to be more secure, you can look into restricting access to your website to Cloudflare's IP ranges (plus a few select IPs for testing only), to prevent traffic from bypassing Cloudflare and hitting the azurewebsites.net URL directly.

Azure Blob and Azure Lifecycle Management

· 7 min read

Azure Blob storage (Platform-as-a-Service (PaaS)) is used for streaming and storing documents, videos, pictures, backups, and other unstructured text or binary data… however, the functionality extends beyond just a place to "store stuff": it can save you money and time by automating the lifecycle of your data using Azure Blob Lifecycle Management and access tiers.

As of January 2021, Blob storage now supports the Network File System (NFS) 3.0 protocol. This support provides Linux file system compatibility at object storage scale and prices and enables Linux clients to mount a container in Blob storage from an Azure Virtual Machine (VM) or a computer on-premises.

Blobs - “Highly scalable, REST-based cloud object store”

  • Data sharing, Big Data, Backups
  • Block Blobs: Read and write data in blocks. Optimized for sequential IO. Most cost-effective Storage. Ideal for files, documents & media.
  • Page Blobs: Optimized for random access and can be up to 8 TB in size. IaaS VM OS & data disks and backups are of this type.
  • Append Blobs: Like block blobs and optimized for append operations. Ideal for logging scenarios and total size can be up to 195 GB.

Aren’t there only 2 access tiers?

When you create an Azure Storage account, you get presented with 2 options for the Access Tier:

  • Hot
  • Cool

Hot access tier

The hot access tier has higher storage costs than cool and archive tiers, but the lowest access costs. Example usage scenarios for the hot access tier include:

  • Data that is in active use or is expected to be read from and written to frequently.
  • Data that is staged for processing and eventual migration to the cool access tier

Cool access tier

The cool access tier has lower storage costs and higher access costs compared to hot storage. This tier is intended for data that will remain in the cool tier for at least 30 days. Example usage scenarios for the cool access tier include:

  • Short-term backup and disaster recovery
  • Older data not used frequently but expected to be available immediately when accessed.
  • Large data sets that need to be stored cost-effectively while more data is being gathered for future processing.

These options are set globally for your Azure Storage account's blobs; however, there is a third tier, the Archive access tier:

Archive access tier

The Archive access tier has the lowest storage cost, but higher data retrieval costs compared to hot and cool tiers.

Data must remain in the archive tier for at least 180 days or be subject to an early deletion charge. Data in the archive tier can take several hours to retrieve depending on the specified rehydration priority.

While a blob is in archive storage, the blob data is offline and cannot be read or modified. To read or download a blob in the archive, you must first rehydrate it to an online tier.
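Rehydration can be triggered by changing the blob's tier back to Hot or Cool, for example with the Az.Storage module. A sketch; the account, container, and blob names are placeholders, and the SetAccessTier call comes from the underlying Azure.Storage.Blobs client:

```powershell
# Connect to the storage account and locate the archived blob
$ctx  = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount
$blob = Get-AzStorageBlob -Container "backups" -Blob "2020/archive.bak" -Context $ctx
# Request rehydration to the Hot tier; this can take several hours to complete
$blob.BlobClient.SetAccessTier("Hot", $null, "Standard")
```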

How is this charged?

The tier your data is in determines the costs. Azure Blob Storage is charged on read/write and list operations, among other factors, for example:

  • Hot Tier: Lower access prices for frequent use
  • Cool Tier: Lower storage prices for high volume
  • The volume of data stored per month.
  • Quantity and types of operations performed, along with any data transfer costs.
  • Data redundancy option selected.

More information here: https://azure.microsoft.com/en-us/pricing/details/storage/blobs/

What is data lifecycle management?

There are many versions of it, but at its core, there are 5 stages to simple data lifecycle management:

  • Creation – When the data is first created.
  • Storage – Where the data is stored.
  • Usage – When the data is useful and relevant and used.
  • Archival – When the data is not as useful, but still helpful to have around due to knowledge or legal requirements.
  • Destruction – When the data is completely irrelevant and there is no need to store or use it anymore.

Right... so, tell me more about the Azure Blob Lifecycle Management?

Azure Blob Storage has a lifecycle management feature built in. Azure Blob Storage lifecycle management offers a rich, rule-based policy for General Purpose v2, Blob, and Premium Block Blob storage accounts.

  • Imagine you are working on a project, such as purchasing a new company. You not only want somewhere to store that data, you also want to make sure it is accessible quickly, so you put it in an Azure Blob Storage account under the Hot tier.
  • You’ve then spent some time working on new documents using the data you acquired when you purchased the ‘new’ company. Since you don’t touch them anymore, you don’t want them sitting on fast storage costing you additional money, so they get migrated to the ‘Cool’ access tier.
  • A few months later, you realize that you need some of the original data from the company acquisition. You find the files and use them; they take a bit longer to open because the data needs to be migrated back to the ‘Hot’ tier, but you are happy because the data you want is there.
  • A year later, you are onto acquiring another company, and the data from the acquisition that now seems a lifetime ago is forgotten about. However, you know you might need it for legal or financial auditing purposes, so the data goes into the Archive tier, costing you less than the Cool tier but available to rehydrate at a later date if needed (for an extra charge).
  • 7 years down the track, you’re now a multi-million-dollar firm and have completely forgotten about or no longer need the data from your original acquisition. The data then gets deleted, saving you money and data management costs.

Microsoft Azure and Lifecycle Management for Blob Storage automate the entire lifecycle for you.

How do I enable or configure Azure Blob Lifecycle Management?

  1. Log in to the Azure Portal
  2. Find the Azure storage account you want to configure Lifecycle Management on
  3. On the Storage account left-hand side Blade, under Blob Service click on Lifecycle Management
  4. Click on Add a rule
  5. Enter a rule name that suits your naming standards, for example AzureBlobLifecyclePolicy. Note: Make sure Append Blobs is unselected; moving access tiers is unsupported for append blobs (though they can still be deleted after x number of days).
  6. Click Next
  7. This is where the magic happens; we are going to go with the following:
  8. Base blobs that were last modified more than 90 days ago will be moved to Cool storage.
  9. Click on + Add if-then block; now we will select Archive storage. In this example we will archive data that has been in Cool storage for 90 days, so we enter 180 days. Note: Migrating data between access tiers does not change the last modified date of the file, so it is 90 days to move to Cool, then another 90 days to move to Archive.
  10. Click on + Add if-then block; now we will select Delete the blob. Data that has been in Archive storage for 90 days will be deleted, so we enter 270 days.
  11. Click Next and do the same for Snapshots and versions and click Save.
  12. Congratulations, you have now created an Azure Blob Lifecycle policy!

Once the policy has been saved, it is enabled by default. You can disable it by selecting the policy and selecting Disable on the top banner.

Azure Blob Lifecycle Management

#ProTip - You can also view the policy as Code in Code View, which is a simple and quick way of documenting and modifying your lifecycle policy.
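As a sketch, the rule built in the steps above would appear in Code View roughly like this (the rule name matches the earlier example):

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "AzureBlobLifecyclePolicy",
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": [ "blockBlob" ] },
        "actions": {
          "baseBlob": {
            "tierToCool":    { "daysAfterModificationGreaterThan": 90 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 180 },
            "delete":        { "daysAfterModificationGreaterThan": 270 }
          }
        }
      }
    }
  ]
}
```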

#ProTip - You can have multiple Lifecycle Policies on a single storage account.

#ProTip - You can learn more about Lifecycle policies by going to the Microsoft documentation here: Optimize costs by automating Azure Blob Storage access tiers.

#ProTip - If you are looking for integration with Azure AD or Active Directory NTFS permissions, or for replicating data from file servers, you are better off looking at Azure File Shares rather than blob storage.