
· 5 min read

Cloud computing offers many benefits over traditional on-premises infrastructure; ecosystems such as Microsoft Azure have an underlying fabric built for today's 'software as a service' or 'software-defined' world.

The shift from managing on-premises Exchange environments for mail to consuming Microsoft 365 services has freed up time for IT and the business to adopt, consume and continuously improve their technology - to get the most out of it and remain competitive in this challenging world.

Below is a high-level list of what I consider some of the benefits of using the Microsoft Azure ecosystem:

  • Most Azure regions offer three Availability Zones; each zone acts as a separate datacentre with redundant power and networking, letting you quickly spread your services across different fault domains and zones for better resiliency while keeping them logically and physically close together.
  • Geo-redundant replication of backups for Virtual Machines and PaaS/File Shares, with cross-region restore functionality (e.g., Australia East/Australia Southeast).
  • A multitude of hosts (supporting both AMD and Intel workloads) that are continually patched, maintained and tuned for virtualisation performance, stability and security. No longer do we need to spend hours patching, maintaining and licensing on-premises hypervisors (ever more important as these systems are increasingly targeted for vulnerabilities), or architecting how many physical hosts we may need to support a system.
  • Consistent, up-to-date hardware: no need to worry about lead times for new hardware, three-yearly hardware refreshes, or procurement and implementation costs, freeing you to spend time improving the business and tuning your services (scaling up and down, trying new technologies, turning off devices, etc.).
  • For those that like to hoard every file that ever existed, the Azure platform allows scale (in and out to suit your file sizes) along with cost-saving opportunities and tweaks with Automation and migrating files between cool/hot tiers.
  • No need to pay datacentre hosting costs
  • No need to worry about redundant switching
  • With multiple hosts, risks such as air-conditioning leaks and hardware failure are no longer yours to manage; you don't need to worry about these unfortunate events occurring.
  • No need to pay electricity costs to host your workloads.
  • Reduced IT labour costs and time to implement and maintain systems
  • On-demand resources: you can easily stand up separate networks, unattached to your production network, for testing or other devices, without working through VLANs or complex switching and firewalls.
  • Azure virtual networks have basic DDoS protection enabled by default.
  • Backups are secure by default; they are offline and managed by Microsoft, so if a ransomware attack occurs, it won't be able to touch your backups.
  • Constant security recommendations and improvements built into the platform.
  • Azure Files is geo-redundant and across multiple storage arrays, encrypted at rest.
  • Windows/SQL licensing is all covered as part of the costings, so there's no need to worry about not adhering to MS licensing; Azure helps simplify what can sometimes be confusing and complex licensing.
  • Extended Security Updates for out-of-support server OSes such as Windows Server 2008 R2 and Windows Server 2012 R2, without having to pay for extended update support.
  • Ability to leverage modern remote desktop and application technologies, such as Windows 365 and Azure Virtual Desktop, by accessing services hosted in Azure.
  • Having your workloads in Azure is a step towards removing the need for traditional domain controllers and migrating to Microsoft Entra ID joined devices.
  • Azure Automanage functionality is built in to automatically patch Linux (and Windows, of course!) without having to manage separate patching technologies for cross-platform infrastructure.
  • Azure has huge support for automation via PowerShell, the CLI and APIs, allowing you to standardise, maintain, tear down and create infrastructure, services and monitoring on an as-needed basis.
  • Azure datacentres are sustainable and run on renewable energy where they can; Microsoft has committed to becoming fully renewable.
  • No need for NAS or Local Backups, the backups are all built into Azure.
  • Compliant datacentres across various global security standards.
  • Ability to migrate or expand your resources from Australia to ‘NZ North’ or other new or existing data centres! Azure is global and gives you the ability to scale your infrastructure to a global market easily or bring your resources closer to home if a data centre becomes available.
  • We all know that despite the best of intentions, we rarely ever test, develop, and improve disaster recovery scenarios, sometimes this is because of the complexity of the applications and backup infrastructure. Azure Site Recovery, Geo-Redundant backup, Load Balancers and automation helps make this a lot easier.
  • Ability to better utilise cloud security tools (such as Azure Security Center) across cloud and on-premises workloads consistently, using Azure Arc and Azure Policy.
  • And finally - more visibility into the true cost and value of your IT infrastructure. On-premises, the total cost is hidden behind electricity costs, outages and incidents that would not have impacted cloud resources, slow time to deployment or market, outdated and insecure technologies, and services you are most likely running that you don't need!
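To illustrate the Availability Zones point above, here is a hedged sketch (all resource names are placeholders, assuming the Az PowerShell module) of pinning a new VM to a specific zone:

```powershell
# Hypothetical example: create a VM in Availability Zone 2 of
# Australia East. Resource group, VM name and size are placeholders.
New-AzVM -ResourceGroupName 'demo-rg' `
    -Name 'demo-vm01' `
    -Location 'australiaeast' `
    -Zone 2 `
    -Image 'Win2019Datacenter' `
    -Size 'Standard_B2ms'
```

Deploying a second VM with `-Zone 3` would place it in a different physical datacentre within the same region, giving zone-level redundancy.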

#ProTip - Resources such as the Azure Total Cost of Ownership (TCO) calculator can help you estimate the true cost of your workloads.

· 3 min read

I ran into a weird issue while troubleshooting an 'Always On VPN' installation running off Windows Server 2019; the clients were getting the following error in the Application event log:

Error 809 The network connection between your computer and the VPN server could not be established

In my case, the issue wasn't due to IKEv2 fragmentation or anything to do with NAT allowing the origin IP to flow to the Always On VPN server. It was due to the number of ports being limited to 2. I found an old post regarding Windows Server 2008 R2:

"If more than two clients try to connect to the server at the same time, the Routing and Remote Access service rejects the IKEv2 connection requests. Additionally, the following message is logged in the Rastapi.log file:"

This matched my issue; I had never seen more than 2 connections at once.

Increase Ports

  1. Open Routing and Remote Access
  2. Click on your Routing and Remote Access server
  3. Right-click on Ports
  4. Click on: WAN Miniport (IKEv2)
  5. Click Configure
  6. Under 'To enable remote access', ensure that Remote access connections (inbound only) is checked.
  7. Change Maximum ports from 2 (as an example) to a number that matches how many connections you want - I went with 128
  8. Click Ok
  9. Click Apply
  10. Restart the Routing and Remote Access server. You should now see more ports listed as 'inactive' until a new session comes along and uses one.
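If you prefer to script the final restart, the RRAS service can be restarted with PowerShell (run elevated; note this will drop any active VPN sessions):

```powershell
# Restart the Routing and Remote Access service so the new
# maximum port count takes effect.
Restart-Service -Name RemoteAccess -Force
```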

Routing and Remote Access


Enable TLS 1.1

Although this wasn't my initial fix: I had a Microsoft Support call open regarding this issue, and after analysing the logs they recommended enabling TLS 1.1 (which is disabled by default on Windows Server 2019). I would only do this as a last resort, if required.

Run the PowerShell script below (as Administrator) to enable TLS 1.1; you can always run the Disable script afterwards to remove the changes.

Enable TLS 1.1

function Enable-Tls11 {
    New-Item 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server' -Force
    New-Item 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client' -Force
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server' -Name 'Enabled' -Value '1' -PropertyType 'DWORD' -Force
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server' -Name 'DisabledByDefault' -Value '0' -PropertyType 'DWORD' -Force
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client' -Name 'Enabled' -Value '1' -PropertyType 'DWORD' -Force
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client' -Name 'DisabledByDefault' -Value '0' -PropertyType 'DWORD' -Force
    Write-Host 'Enabling TLSv1.1'
}
Enable-Tls11

Disable TLS 1.1

function Disable-Tls11 {
    New-Item 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server' -Force
    New-Item 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client' -Force
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server' -Name 'Enabled' -Value '0' -PropertyType 'DWORD' -Force
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server' -Name 'DisabledByDefault' -Value '1' -PropertyType 'DWORD' -Force
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client' -Name 'Enabled' -Value '0' -PropertyType 'DWORD' -Force
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client' -Name 'DisabledByDefault' -Value '1' -PropertyType 'DWORD' -Force
    Write-Host 'Disabling TLSv1.1'
}
Disable-Tls11

· One min read

Are you attempting to update the Active Directory Schema for LAPS (Local Administrator Password Solution) and keep getting the error below?

Update-AdmPwdAdSchema: The requested attribute does not exist

Here are a few things you can check:

  • Make sure you are a Schema Admin
  • Run PowerShell as Administrator
  • Run the PowerShell to update the schema directly from the Schema Master

You can use the snippet below to check which Domain Controller the Schema Master role is running from:

Get-ADDomainController -Filter * | Select-Object Name, Domain, Forest, OperationMasterRoles | Where-Object {$_.OperationMasterRoles}
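Putting the checklist above together, here is a minimal sketch (assuming the AdmPwd.PS module that ships with the LAPS install) to run from an elevated PowerShell session on the Schema Master:

```powershell
# Load the LAPS management module and extend the AD schema
# with the ms-Mcs-AdmPwd attributes.
Import-Module AdmPwd.PS
Update-AdmPwdAdSchema
```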

· 7 min read

AVD-Collect is a handy PowerShell script created by Microsoft Customer Support Services to assist with troubleshooting and resolving issues with Azure Virtual Desktop (and Windows 365), by capturing Logs for analysis (which could then be passed to Microsoft or allow you to delve deeper) and running basic Diagnostics against some common known issues.

You can download this script from:

There is no publicly available GitHub repository for it currently; Microsoft will retain the latest version of the script at this link.

This script was NOT created by me and comes 'as-is'; this article is merely intended to share the script to assist others in their AVD troubleshooting.

This script is intended to help Microsoft Customer Support assist customers, but it was made publicly accessible to help with MS Support cases and Azure Virtual Desktop diagnostics. No data is automatically uploaded to Microsoft.

Please be aware that the script may change and include new functionality not part of this article, please review the Changelog and Readme of the script directly.

A lot of the information below is contained in the script readme (including a list of the extensive diagnostics and log locations) and changelog; however, I am supplying this article for reference and to help share this nifty tool.

Script pre-requisites

  1. The script must be run with elevated permissions to collect all required data.

  2. All collected data will be archived into a .zip file located in the same folder as the script itself.

  3. As needed, run the script on AVD host VMs and/or Windows-based devices from where you connect to the AVD hosts.

  4. When launched, the script will present the Microsoft Diagnostic Tools End User License Agreement (EULA). You need to accept the EULA before you can continue using the script.

  5. If the script does not start, complaining about execution restrictions, then in an elevated PowerShell console run:

    	Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Force -Scope Process

Acceptance of the EULA will be stored in the registry under HKCU\Software\Microsoft\CESDiagnosticTools, and you will not be prompted again to accept it as long as the registry key is in place. You can also use the "-AcceptEula" command line parameter to accept the EULA silently. This is a per-user setting, so each user running the script will accept the EULA once.

Script scenarios

Core - suitable for troubleshooting issues that do not involve Profiles or Teams or MSIX App Attach

  • Collects core troubleshooting data without including Profiles/FSLogix/OneDrive or Teams or MSIXAA related data
  • Runs Diagnostics.

Core + Profiles - suitable for troubleshooting Profiles issues

  • Collects all Core data
  • Collects Profiles/FSLogix/OneDrive related information, as available
  • Runs Diagnostics.

Core + Teams - suitable for troubleshooting Teams issues

  • Collects all Core data
  • Collects Teams related information, as available
  • Runs Diagnostics.

Core + MSIX App Attach - suitable for troubleshooting MSIX App Attach issues

  • Collects all Core data
  • Collects MSIX App Attach related information, as available
  • Runs Diagnostics.

Core + MSRA - suitable for troubleshooting Remote Assistance issues

  • Collects all Core data
  • Collects Remote Assistance related information, as available
  • Runs Diagnostics.

Extended (all) - suitable for troubleshooting most issues, including Profiles/FSLogix/OneDrive, Teams and MSIX App Attach

  • Collects all Core data
  • Collects Profiles/FSLogix/OneDrive related information, as available
  • Collects Microsoft Teams related information, as available
  • Collects MSIX App Attach related information, as available
  • Runs Diagnostics.


DiagOnly - suitable when you only want to run Diagnostics

  • Skips all Core/Extended data collection and runs Diagnostics only (regardless of any other parameters that have been specified).

The default scenario is "Core".

Available command line parameters (to preselect the desired scenario)

  • Core - Collects Core data + Runs Diagnostics
  • Extended - Collects all Core data + Extended (Profiles/FSLogix/OneDrive, Teams, MSIX App Attach) data + Runs Diagnostics
  • Profiles - Collects all Core data + Profiles/FSLogix/OneDrive data + Runs Diagnostics
  • Teams - Collects all Core data + Teams data + Runs Diagnostics
  • MSIXAA - Collects all Core data + MSIX App Attach data + Runs Diagnostics
  • MSRA - Collects all Core data + Remote Assistance data + Runs Diagnostics
  • DiagOnly - The script will skip all data collection and will only run the diagnostics part (even if other parameters have been included).
  • AcceptEula - Silently accepts the Microsoft Diagnostic Tools End User License Agreement.

Usage example with parameters

To collect only Core data (excluding Profiles/FSLogix/OneDrive, Teams, MSIX App Attach):

	.\AVD-Collect.ps1 -Core

To collect Core + Extended data (incl. Profiles/FSLogix/OneDrive, Teams, MSIX App Attach):

	.\AVD-Collect.ps1 -Extended

To collect Core + Profiles + MSIX App Attach data

	.\AVD-Collect.ps1 -Profiles -MSIXAA

To collect Core + Profiles data

	.\AVD-Collect.ps1 -Profiles

If you are missing any of the data that the script should normally collect, check the content of the "__AVD-Collect-Log.txt" and "__AVD-Collect-Errors.txt" files for more information. Some data may not be present during data collection and thus not picked up by the script.

Execute the script

  1. Download the AVD-Collect script to the session host you need to collect the logs from, if you haven't already.

  2. Extract the script to a folder (i.e. C:\Users\%username%\Downloads\AVD-Collect)

  3. Right-click on: AVD-Collect.ps1, select Properties

  4. Because this file has been downloaded from the Internet, it may be in a protected/block status - select Unblock and click Apply

  5. Open Windows Powershell as Administrator

  6. Now we need to change the directory for where the script is located; in my example, the command I use is:

    cd 'C:\Users\Luke\Downloads\AVD-Collect'
  7. By default, the script will run as 'Core', and I want to include everything, profiles, Teams etc., so run Extended:

    .\AVD-Collect.ps1 -Extended -AcceptEula
  8. Read the notice from the Microsoft Customer Support centre and press 'Y' to accept and move on to the next steps.

  9. The script will now run.

  10. You will start to see new folders created in the directory that the script is running from, containing the extracted log files. The script will take a few minutes to complete as it extracts the logs and then zips them.

  11. Once the script has run, there will be a ZIP file of all the logs collected by the script. In my example, the logs consisted of:

  • Certificates
  • Recent Event Logs
  • FSLogix logs
  • Networking
  • Registry keys
  • Teams information
  • System information
  • Networking and Firewall information

  12. If needed, you can now send or upload the ZIP file to Microsoft support. If you are troubleshooting yourself, you can navigate to the folders to look at the specific logs you want, all in one place!

  13. To look at diagnostic information, open the AVD-Diag.html file.

  14. You can now see a list of common issues, what the script is looking for, and whether the host has passed or failed these checks (this can be very useful for Azure Virtual Desktop hosts, to make sure all the standard configuration is done or being applied, including making sure that the session host has access to all the external resources it needs).

· 15 min read

WebJEA allows you to build web forms for any PowerShell script dynamically. WebJEA automatically parses the script at page load for description, parameters and validation, then dynamically builds a form to take input and display formatted output!

The main goals for WebJEA:

  • Reduce delegation of privileged access to users
  • Quickly automate on-demand tasks and grant access to less-privileged users
  • Leverage your existing knowledge in PowerShell to build web forms and automate on-demand processes
  • Encourage proper script creation by parsing and honouring advanced function parameters and comments

Because WebJEA is simply a self-service portal for PowerShell scripts, anything you can script with PowerShell you can run through the portal, opening up a lot of opportunities for automation without having to learn third-party automation toolsets! Anyone who knows PowerShell can use it, and each script can be locked down to specific users and AD groups!

You can read more about WebJEA directly on the GitHub page:

This guide will concentrate on setting up WebJEA for self-service Azure VM management. However, WebJEA can be used to enable much more than what this blog article covers, from things such as new user onboarding, to resource creation.

WebJEA - Start/Stop

We will use a Windows Server 2019, running in Microsoft Azure, to run WebJEA from.


  • Domain Joined server running Windows 2016+ Core/Full with PowerShell 5.1
  • The server must have permission to go out over the internet to Azure and download PowerShell modules.
  • CPU/RAM Requirements will depend significantly on your usage, start low (2-vCPU/4GB RAM) and grow as needed.

I've created a Standard_B2ms (2vCPU, 8GB RAM) virtual machine, called: WEBJEA-P01 in an Azure Resource Group called: webjea_prod

This server is running: Windows Server 2019 Datacenter and is part of my Active Directory domain; I've also created a service account called: webjea_services.

Setup WebJEA

Once we have a Windows Server, now it's time to set up WebJEA!

Setup Self-Signed Certificate

If you already have a certificate you can use, skip this step. In the case of this guide, we are going to use a self-signed certificate.

Log into the WebJEA Windows server using your service account (in my case, it is: luke\webjea_services).

Open PowerShell ISE as Administrator and, after replacing the DNS names to suit your own environment, run the following to create the Root CA, create the actual self-signed certificate signed by it, and add the Root CA to the trusted store:

#Create RootCA
$rootCA = New-SelfSignedCertificate -Subject "CN=MyRootCA" `
-KeyExportPolicy Exportable `
-KeyUsage CertSign,CRLSign,DigitalSignature `
-KeyLength 2048 `
-KeyUsageProperty All `
-KeyAlgorithm 'RSA' `
-HashAlgorithm 'SHA256' `
-Provider "Microsoft Enhanced RSA and AES Cryptographic Provider" `
-NotAfter (Get-Date).AddYears(10)

#Create Self-Signed Certificate
$cert = New-SelfSignedCertificate -Subject "" `
-Signer $rootCA `
-KeyLength 2048 `
-KeyExportPolicy Exportable `
-DnsName WEBJEA, WEBJEA-P01 `
-KeyAlgorithm 'RSA' `
-HashAlgorithm 'SHA256' `
-Provider "Microsoft Enhanced RSA and AES Cryptographic Provider" `
-NotAfter (Get-Date).AddYears(10)
$certhumbprint = $cert.Thumbprint

#Add Root CA to Trusted Root Authorities
New-Item -ItemType Directory 'c:\WebJea\certs' -Force
Export-Certificate -Cert $rootCA -FilePath "C:\WebJEA\certs\rootCA.crt" -Force
Import-Certificate -CertStoreLocation 'Cert:\LocalMachine\Root' -FilePath "C:\WebJEA\certs\rootCA.crt"

Write-Host -ForegroundColor Green -Object "Copy this: $certhumbprint - The Thumbprint is needed for the DSCDeploy.ps1 script"

Copy the Thumbprint (if you do this manually, make sure it is the Thumbprint of the certificate, not the Trusted Root CA certificate); we will need that later.
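If you lose track of the thumbprint, you can list the machine's personal certificates again; a generic sketch:

```powershell
# List certificates in the Local Machine Personal store with
# their subjects and thumbprints.
Get-ChildItem Cert:\LocalMachine\My | Select-Object Subject, Thumbprint
```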

Setup a Group Managed Service Account

This is the account we will use to run WebJEA under; it can be a normal Active Directory user account if you feel more comfortable with that or want to assign permissions to it.

I am using a normal AD (Active Directory) service account in this guide because I am using Microsoft Entra ID Domain Services as my Domain Controller, and GMSA is not currently supported. I have also seen some scripts require the ability to create and read user-specific files. However, it's always good to follow best practices where possible.

Note: Group Managed Services accounts automatically renew and update the passwords for the accounts; they allow for additional security. You can read more about them here: Group Managed Service Accounts Overview.

#Create a Group Managed Service Account
Add-KdsRootKey -EffectiveTime ((Get-Date).AddHours(-10))
New-ADServiceAccount -Name webjeagmsa1 -DNSHostName (Get-ADDomainController).HostName -PrincipalsAllowedToRetrieveManagedPassword

#Create AD Group
New-ADGroup -Name "WebJEAAdmins" -SamAccountName WebJEAAdmins -GroupCategory Security -GroupScope Global -DisplayName "WebJEA - Admins" -Description "Members of this group are WebJEA Admins"

Install-ADServiceAccount webjeagmsa1
Add-ADGroupMember -Identity "\WebJEAAdmins" -Members (Get-ADServiceAccount webjeagmsa1).DistinguishedName

Add the WebJEAAdmins group to the Administrators group of your WebJEA server.
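On the WebJEA server itself, that group membership can be added with the built-in LocalAccounts cmdlets (the domain name below is a placeholder):

```powershell
# Add the domain group to the local Administrators group.
# Replace CONTOSO with your own domain's NetBIOS name.
Add-LocalGroupMember -Group 'Administrators' -Member 'CONTOSO\WebJEAAdmins'
```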

Install WebJEA

Download the latest release package (zip file) onto the WebJEA Windows server

Extract it, and you should have 2 files and 2 folders:

  • Site\
  • StarterFiles\
  • DSCDeploy.ps1
  • DSCConfig.ps1

Open PowerShell ISE as Administrator and open DSCDeploy.ps1.

WebJEA uses PowerShell DSC (Desired State Configuration) to handle much of the setup.

DSC will do the following for us:

  • Install IIS
  • Create the App Pool and set the identity
  • Create and migrate the Site files to the IIS website folder
  • Configure SSL (if we were using it)
  • Update the WebJEA config files to point towards the script and log locations

Even though most of the work will be automated for us by Desired State Configuration, we still have to do some configuration to suit our environment.

I am not using a Group Managed Service Account. Instead, I will use a normal AD account as a service account (i.e. webjea_services), but if you use a GMSA, you need to put the username in the AppPoolUserName; no credentials are needed (make sure the GMSA has access to the server).

Change the following variables to suit your setup; in my case, I have moved the WebJEA resources into their own folder, so they're not sitting directly on the root of the OS drive.

  • NodeName - This is a DSC variable; leave this.
  • WebAppPoolName - Web App Pool name; it may be best to leave this as WebJEA, however you can change it.
  • AppPoolUserName - Add in your GMSA or domain service account username.
  • AppPoolPassword - If using a domain account, add the password here; if using a GMSA, leave blank.
  • WebJEAIISURI - The IIS URL, i.e. server/WebJEA. You can change this if you want.
  • WebJEAIISFolder - IIS folder location; this can be changed if you want to move IIS to another drive or location.
  • WebJEASourceFolder - The source folder for the WebJEA files when they are first downloaded and extracted (i.e. the Downloads directory).
  • WebJEAScriptsFolder - Where the scripts folder will be placed (i.e. where WebJEA is installed).
  • WebJEAConfigPath - Where the config file will be placed (it needs to be the same location as the scripts folder).
  • WebJEALogPath - WebJEA log path.
  • WebJEA_Nlog_LogFile - WebJEA system log location.
  • WebJEA_Nlog_UsageFile - WebJEA usage log location.


One thing to note is that DSCDeploy.ps1 dot-sources the DSCConfig script; by default, it looks for it in the same folder as DSCDeploy.ps1.

If you have just opened PowerShell ISE, you may notice that you are actually in C:\Windows\System32, so it won't be able to find the script to run. You can either change the script to point directly to the file location, or change the directory you are in to match the files; in my case, in the Script pane I run the following:

cd 'C:\Users\webjea_services\Downloads\webjea-'

Now run the script and wait.

If you get an error saying that the script is not digitally signed, run the following in the script pane:

Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass

This is because the PowerShell execution policy hasn't been set; depending on the scripts you are running, you may have to update the execution policy for the entire system, but for now we will set it to Bypass for this process only. Now re-run the script; you should see DSC kick off and start configuring IIS and the WebJEA site.


You should also see the files/folders starting to be created!

Note: If you need to make a configuration change (i.e. replacing the self-signed certificate with a managed PKI certificate), change it in DSCDeploy.ps1 and rerun the script; DSC will ensure the configuration is applied as defined.

Once DSC has completed, your server should now be running IIS and the WebJEA site.

To add the IIS Management Console (not required, but it will help you manage IIS), run the following PowerShell cmdlet:

Enable-WindowsOptionalFeature -Online -FeatureName IIS-ManagementConsole

Open an Internet Browser and navigate to (your equivalent of):

If you need assistance finding the website path, open Internet Information Services (IIS) Manager, expand Sites > Default Web Site, right-click WebJEA, select Manage Application and click Browse.


If successful, you should get a username and password prompt:


That's normal - it means you haven't been given access yet and now need to configure it.

Configure WebJEA

Now that WebJEA has been set up, it is time to configure it. The first thing we need to do is create a group for WebJEA admins (who can see all scripts).

Create an Active Directory group for:

  • WebJEA-Admins
  • WebJEA-Users

Add your account to the: WebJEA-Admins group.

Navigate to your WebJEA scripts folder; in my case, I set it up under c:\WebJEA\Scripts:

WebJEA - Scripts

Before we go any further, take a backup of the config.json file: make a copy of it and rename the copy to "config.bak".
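The backup itself is a one-liner; for example (assuming the scripts path used in this guide):

```powershell
# Keep a copy of the working config before editing it.
Copy-Item -Path 'C:\WebJEA\scripts\config.json' -Destination 'C:\WebJEA\scripts\config.bak'
```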

I recommend using Visual Studio Code to edit the config.json to help avoid any syntax issues.

Now right click config.json and open it to edit

This file is the glue that holds WebJEA together.

We are going to make a few edits:

  • Feel free to update the Title to match your company or team
  • Add the WebJEA-Admins group created earlier (including the domain name) into the permittedgroups section - this controls access for ALL scripts.

Note the \\ required for each path. If you get a syntax error when attempting to load the WebJEA webpage, a missing escape character is most likely the cause.

WebJEA - Demo

Save the config file and relaunch the WebJEA webpage. It should now load without prompting for a username and password.

Set the PowerShell execution policy on the machine to Unrestricted so that you can run any PowerShell scripts on it:

Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope LocalMachine

WebJEA - Demo

If you get an 'AuthorizationManager check failed' error, it is because the PowerShell scripts are still blocked after being downloaded from the internet. Run the following command to unblock them, then refresh the WebJEA webpage:

Get-ChildItem -Path 'C:\WebJEA\scripts\' -Recurse | Unblock-File

You now have a base WebJEA install! By default, WebJEA comes with 2 PowerShell files:

  • overview.ps1
  • validate.ps1

You may have noticed these in the config.json file; WebJEA actually runs the overview.ps1 file as soon as the page loads, so you can have a script run before running another one, which is handy when you need to know the current state of something before taking action.

The validate.ps1 script is an excellent resource to check out the parameter types used to generate the forms.

Setup Azure Virtual Machine Start/Stop

Now that we have a working WebJEA install, it's time to set up the Azure VM Start/Stop script for this demo.

On the WebJEA server, we need to install the Azure PowerShell modules; run the following in PowerShell as Administrator:

Install-Module Az -Scope AllUsers

Create Service Principal

Once the Az PowerShell modules are installed, we need to set a Service Principal for the PowerShell script to connect to Azure to manage our Virtual Machines.

Run the following PowerShell cmdlet to connect to Azure:
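The cmdlet itself appears to have been lost from the snippet that follows; presumably it is a standard interactive login:

```powershell
# Interactive sign-in to Azure; opens a browser or device-code prompt.
Connect-AzAccount
```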


Now that we are connected to Azure, we need to create the SPN; run the following:

$sp = New-AzADServicePrincipal -DisplayName WebJEA-AzureResourceCreator -Role Contributor
$BSTR = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($sp.Secret)
$UnsecureSecret = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto($BSTR)

Now you have created an SPN called WebJEA-AzureResourceCreator. We now need to grab the Tenant ID; run the following:

Get-AzContext | Select-Object Tenant

Now that we have the SPN and Tenant ID, it's time to test connectivity.

# Login using service principal 
$Secret = ConvertTo-SecureString -String 'SECRETSTRINGHERE' -AsPlainText -Force
$Credential = [System.Management.Automation.PSCredential]::New($ApplicationId, $Secret)
Connect-AzAccount -ServicePrincipal -Credential $Credential -TenantId $TenantId

Copy the TenantID into the TenantID section



Retrieve the ApplicationID created for the SPN in the previous step and add it into the ApplicationID part.

Type in:


Retrieve the Secret created for the SPN and add it to the string.

Now run the snippet, and you should be successfully connected to Azure.
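If you need to look those values up again later, something along these lines may help (a sketch; the display name matches the SPN created above, and the property names can vary between Az module versions):

```powershell
# Recover the tenant ID and the SPN's application ID.
Get-AzContext | Select-Object Tenant
Get-AzADServicePrincipal -DisplayName 'WebJEA-AzureResourceCreator' |
    Select-Object DisplayName, ApplicationId
```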

Create Get-VM script

One of the features of WebJEA is the ability to run scripts on page load. So, to get the current power state of our Azure VMs, in the WebJEA scripts directory create a new PS1 file called: Get-VM.ps1.

Add the following script to it:

# Login using service principal (replace the placeholders with your SPN values)
$ApplicationId = 'APPLICATIONIDHERE'
$TenantId = 'TENANTIDHERE'
$Secret = ConvertTo-SecureString -String 'SECRETSTRINGHERE' -AsPlainText -Force
$Credential = [System.Management.Automation.PSCredential]::New($ApplicationId, $Secret)
Connect-AzAccount -ServicePrincipal -Credential $Credential -TenantId $TenantId

# List every VM with its current power state
Get-AzVM -Status | Select-Object Name, PowerState, ResourceGroupName

Save the file.
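Before wiring the script into WebJEA, you can sanity-check it locally from a PowerShell session on the server (assuming you have replaced the placeholders with your SPN values):

```powershell
# Run from the WebJEA scripts directory; should output a table of
# VM names, power states and resource groups
.\Get-VM.ps1
```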

Create Set-VM script

Now it's time to create the script that will Start/Stop the Virtual Machine. In the WebJEA scripts directory, create a new PS1 file called: Set-VM.ps1

Add the following script to it:

param (
    [Parameter(Position=1, Mandatory=$true,
    HelpMessage='What is the name of the Azure Virtual Machine?')]
    [string]$VMName,
    [Parameter(Position=2, Mandatory=$true,
    HelpMessage='What is the name of the Azure Resource Group that the Virtual Machine is in?')]
    [string]$RGName,
    [Parameter(Position=3, Mandatory=$true,
    HelpMessage='What action do you want to do?')]
    [ValidateSet('Start', 'Stop')]
    [string]$VMAction
)

# Login using service principal (replace the placeholders with your SPN values)
$ApplicationId = 'APPLICATIONIDHERE'
$TenantId = 'TENANTIDHERE'
$Secret = ConvertTo-SecureString -String 'SECRETSTRINGHERE' -AsPlainText -Force
$Credential = [System.Management.Automation.PSCredential]::New($ApplicationId, $Secret)
Connect-AzAccount -ServicePrincipal -Credential $Credential -TenantId $TenantId

# Start or stop the VM depending on the action selected
if ($VMAction -eq 'Start') {
    Start-AzVM -Name $VMName -ResourceGroupName $RGName -Confirm:$false
}
elseif ($VMAction -eq 'Stop') {
    Stop-AzVM -Name $VMName -ResourceGroupName $RGName -Confirm:$false -Force
}

Save the file.
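You can also test this script locally before adding it to WebJEA; the VM and resource group names below are hypothetical examples:

```powershell
# Stops the (hypothetical) VM 'vm-test01' in the (hypothetical) resource group 'rg-test'
.\Set-VM.ps1 -VMName 'vm-test01' -RGName 'rg-test' -VMAction 'Stop'
```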

Set VM in WebJEA Config

Now that the scripts have been created, it's time to add them to WebJEA to use.

Navigate to your scripts file and make a backup of the config.json file, then edit: config.json

Locate the "overview" command entry (the one ending with "onloadscript": "overview.ps1"). After its closing brace, add a comma, then add the following command object:

{
"id": "StartStopAzVM",
"displayname": "StartStop-AzVM",
"synopsis": "Starts or Stops Azure Based VMs",
"permittedgroups": [".\\Administrators", "\\WebJEAAdmins"],
"script": "Set-VM.ps1",
"onloadscript": "Get-VM.ps1"
}

So your config.json should look similar to:

{
    "Title": "Luke Web Automation",
    "defaultcommandid": "overview",
    "basepath": "C:\\WebJEA\\scripts",
    "LogParameters": true,
    "permittedgroups": [".\\Administrators", "\\WebJEAAdmins"],
    "commands": [
        {
            "id": "overview",
            "displayname": "Overview",
            "synopsis": "Congratulations, WebJEA is now working! We've pre-loaded a demo script that will help you verify everything is working. <br/><i>Tip: You can use the synopsis property of default command to display any text you want. Including html.</i>",
            "permittedgroups": [".\\Administrators"],
            "script": "validate.ps1",
            "onloadscript": "overview.ps1"
        },
        {
            "id": "StartStopAzVM",
            "displayname": "StartStop-AzVM",
            "synopsis": "Starts or Stops Azure Based VMs",
            "permittedgroups": [".\\Administrators", "\\WebJEAAdmins"],
            "script": "Set-VM.ps1",
            "onloadscript": "Get-VM.ps1"
        }
    ]
}


Test Azure Virtual Machine Start/Stop

Now that the scripts have been created, open the WebJEA webpage.

Click on the StartStop-AzVM page (it may take a few seconds to load, as it is running the Get-VM script). You should be greeted by a window similar to below:

WebJEA - Demo

Congratulations, you have now set up WebJEA and can Start/Stop any Azure Virtual Machines using self-service!

Additional Notes

  • There is room for improvement around error checking and doing more with the scripts, such as sending an email when a script is triggered, as a reminder to power the server off again later.
  • Because most of the configuration is JSON/PowerShell files, you could have the entire scripts folder in a git repository to make changes, roll back and keep version history.
  • Remove any hard coding of any secrets to connect to Azure (as an example) from the scripts and implement a password management tool with API access or even the Windows Credential Manager. You want a system where you can easily update the passwords of accounts, limit access and prevent anything from being stored in plain text.
  • Using the permittedgroups section of the config.json file, you can restrict which groups can run which scripts, giving you granular control over who can do what.
  • If you use a normal Active Directory user account as the service account, then for added security make sure that the WebJEA server is the only device that account can log in to, and that it only has the permissions it needs. Also look at implementing PIM (Privileged Identity Management) for some tasks, so the account only has access at the time it needs it.
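On the secrets point above: one lightweight option on Windows is to store the SPN credential encrypted with DPAPI via Export-Clixml, so nothing sits in the scripts in plain text. A sketch, assuming a hypothetical file path and that the scripts run as the same account that exported the credential:

```powershell
# One-time setup, run as the WebJEA service account:
# enter the SPN ApplicationId as the username and the client secret as the password
Get-Credential | Export-Clixml -Path 'C:\WebJEA\secure\spn.xml'

# In the scripts, replace the hard-coded secret with:
$Credential = Import-Clixml -Path 'C:\WebJEA\secure\spn.xml'
Connect-AzAccount -ServicePrincipal -Credential $Credential -TenantId $TenantId
```

Export-Clixml protects the password with DPAPI, so the file can only be decrypted by the same user on the same machine; a dedicated secrets manager is still the stronger option.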