
Azure Storage Account SFTP errors

· 2 min read

As part of standing up and using an Azure Storage account as an SFTP service, I ran into a few issues. This blog post documents my findings in case others run into similar problems.

PTY allocation request failed on channel 0

Even though you appear to have connected successfully, you may see the following errors:

  • PTY allocation request failed on channel 0
  • shell request failed on channel 0

You may laugh, but the solution was very simple: switch from SSH to SFTP!

Like me, you may have just flicked to SSH out of habit.

Home Directory is not accessible

Make sure that the Home directory (folder) is created in your container; SFTP won't create this for you.

Also make sure that the Home directory for the user references Container/Folder, like the below:

Azure Portal - Enable SFTP

Wrong username, authentication failed

When attempting to connect to SFTP using a tool such as WinSCP, I got:

  • Using username "lukeftpuser".
  • Authentication failed.

The username is actually composed of:

STORAGEACCOUNTNAME.LOCALUSERNAME, e.g.: sftpstorageacc1337.lukeftpuser

WinSCP Connection Azure SFTP

Unable to find Azure Storage SSH Keys

This is not an error, but Azure Key Vault does not currently support SSH keypairs, so once Azure creates them for you, they are stored in a Microsoft.Compute/sshPublicKeys resource, found here: SSH Keys
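
If you want to pull a stored public key back out with PowerShell rather than the portal, here is a small sketch using the Az.Compute cmdlet for sshPublicKeys resources (the resource group name is a placeholder):

# List the SSH public key resources in a resource group and show the key material
Get-AzSshKey -ResourceGroupName "storage_rg" | Select-Object Name, PublicKey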

SFTP in Microsoft Azure using Azure Blob Storage

· 11 min read

SSH File Transfer Protocol (SFTP) is now supported in Preview for Azure Blob Storage accounts with hierarchical namespace enabled.

Although tools such as Storage Explorer, Data Factory, and AzCopy allow copying to and from Azure storage accounts, sometimes your applications need more traditional integration, so SFTP is a welcome addition to the Microsoft Azure ecosystem, which in some cases removes the need for additional Virtual Machine(s).

This support enables standard SFTP connectivity to an Azure Storage account. As an Azure PaaS (Platform as a Service) resource, it offers additional flexibility, reduces operational overhead, and increases redundancy and scalability.

We will run through the initial setup of the Azure Storage account using the Azure Portal.

SFTP using an Azure Storage account does not support shared access signature (SAS) or Microsoft Entra ID (Azure AD) authentication for connecting SFTP clients. Instead, SFTP clients must use a password or a Secure Shell (SSH) private/public keypair.

Before we head into the implementation, just a bit of housekeeping: this was still in Preview at the time this post was written; the functionality MAY change by the time it becomes GA (Generally Available).

During the public preview, the use of SFTP does not incur any additional charges. However, the standard transaction, storage, and networking prices for the underlying Azure Data Lake Store Gen2 account still apply. SFTP might incur additional charges when the feature becomes generally available. At the time of the preview, SFTP support is only available in certain regions.

You can connect to the SFTP storage account using a local (to the SFTP storage account) SSH public/private keypair or a password (or both). You can also set up individual HOME directories (because of the hierarchical namespace, these are folders, not containers) for each user (maximum of 1000 local user accounts).

SFTP Azure Storage Account - High Level Diagram

Creating an Azure Storage account for SFTP

This article assumes you have an Azure subscription and rights to create a new Storage account resource; however, if you have an existing storage account, the following prerequisites are required:

  • A standard general-purpose v2 or premium block blob storage account. You can also enable SFTP as you create the account.
  • The account redundancy option of the storage account is set to either locally-redundant storage (LRS) or zone-redundant storage (ZRS); GRS is not supported.
  • The hierarchical namespace feature of the account must be enabled for existing storage accounts. To enable the hierarchical namespace feature, see Upgrade Azure Blob Storage with Azure Data Lake Storage Gen2 capabilities.
  • If you're connecting from an on-premises network, make sure that your client allows outgoing communication through port 22, as SFTP uses that port.

Fill out the SFTP Public Preview Interest Form

Because the SFTP functionality is currently in Preview, Microsoft has asked that anyone interested in the SFTP Preview fill out a Microsoft Form:

This MAY be required before proceeding to the following steps; initially, I believe it was required, but a few people I know have registered the feature without the form. Either way, the SFTP Public Preview Interest form is a good opportunity to supply your use-case information to Microsoft directly, to help improve the service going forward.

Registering the Feature

To create an Azure Storage account that supports SFTP - we need to enable the Preview Feature.

  1. Log in to the Azure Portal

  2. Navigate to: Subscriptions

  3. Select the Subscription that you want to enable SFTP preview for

  4. Click on: Preview features

  5. Search for: SFTP

  6. Click on: SFTP support for Azure Blob Storage and click Register. Registration may take anywhere from minutes to a few days, as each preview request may need to be manually approved by Microsoft personnel based on the Public Preview Interest form. My feature registration occurred quite quickly, so either the approvals have been automated or I was just lucky.

    As you can see in the screenshot below, I had already registered mine:

  7. Azure Portal SFTP Preview Feature

  8. You can continue to hit refresh until it changes from: Registering to Registered.

  9. While we are here, let's check that the Microsoft.Storage resource provider is registered (it should already be enabled, but it is a good opportunity to check before attempting to create a resource and getting a surprise). Click on Resource providers in the left-hand menu and search for: Storage. If it is set to NotRegistered, click on Microsoft.Storage and click Register.

To register the SFTP feature using PowerShell, you can run the following cmdlet:

Register-AzProviderFeature -FeatureName "AllowSFTP" -ProviderNamespace "Microsoft.Storage"
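
You can also check the registration status (and the Microsoft.Storage resource provider mentioned above) from PowerShell; a quick sketch:

# Check whether the SFTP preview feature has finished registering
Get-AzProviderFeature -FeatureName "AllowSFTP" -ProviderNamespace "Microsoft.Storage"

# Ensure the Microsoft.Storage resource provider itself is registered
Register-AzResourceProvider -ProviderNamespace "Microsoft.Storage"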

Create the Azure Storage Account

Now that the Preview feature has been registered, we can create a new Storage account.

  1. Log in to the Azure Portal
  2. Click on +Create a resource
  3. Type in: Storage account, click on the Microsoft Storage account resource, and click Create
  4. Azure Portal - Storage account
  5. Select your Subscription you enabled the SFTP feature in earlier
  6. Select your Resource Group (or create a new resource group) to place your storage account into.
  7. Select your storage account name (this needs to be globally unique and a maximum of 24 characters); in my example, I am going with: sftpstorageacc1337
  8. Select your Region; remember that only specific regions have SFTP support at the time of this article.
  9. Select your performance tier; Premium is supported (remember to select block blobs if you do), but I will select Standard.
  10. Select your Redundancy; remember that GRS and RA-GRS aren't supported at this time. I will select Zone-redundant storage (ZRS) so that my storage account is replicated across the three availability zones, but you can also select LRS (Locally Redundant Storage).
  11. Azure Portal - Create v2 Storage Account
  12. Click Next: Advanced
  13. Leave the Security options as-is and check: Enable hierarchical namespace under the Data Lake Storage Gen2 subheading.
  14. Click Enable SFTP
  15. Azure Portal - Enable SFTP
  16. Click: Next: Networking
  17. SFTP supports Private Endpoints (as a blob storage sub-resource), but in this case, I will be keeping Connectivity as a Public endpoint (all networks)
  18. Azure Portal - Enable SFTP
  19. Click Next: Data Protection
  20. Here you can enable soft-delete for your blobs and containers, so if a file is deleted, it is retained for seven days until it's permanently deleted; I am going to leave mine set as the default of 7 days and click: Next: Tags.
  21. Add in any applicable Tags, e.g. who created it, when you created it, and what you created it for, then click Review + Create
  22. Review your configuration, make sure that Enable SFTP is enabled with Hierarchical namespace and click Create.

In case you are interested in Infrastructure as Code, here is an Azure Bicep file I created for a storage account ready for SFTP, which can be deployed to a Resource Group, ready for the next steps:

storageaccount.bicep
param storageaccprefix string = ''
var location = resourceGroup().location

resource storageacc 'Microsoft.Storage/storageAccounts@2021-06-01' = {
  name: '${storageaccprefix}${uniqueString(resourceGroup().id)}'
  location: location
  sku: {
    name: 'Standard_ZRS'
  }
  kind: 'StorageV2'
  properties: {
    defaultToOAuthAuthentication: false
    allowCrossTenantReplication: false
    minimumTlsVersion: 'TLS1_2'
    allowBlobPublicAccess: true
    allowSharedKeyAccess: true
    isHnsEnabled: true
    supportsHttpsTrafficOnly: true
    encryption: {
      services: {
        blob: {
          keyType: 'Account'
          enabled: true
        }
      }
      keySource: 'Microsoft.Storage'
    }
    accessTier: 'Hot'
  }
}
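
To deploy the Bicep file, something like the following should work (a sketch; it assumes the file is saved locally as storageaccount.bicep, the Bicep CLI is available, and the resource group name is a placeholder):

# Create a resource group and deploy the SFTP-ready storage account
New-AzResourceGroup -Name "sftp_rg" -Location "AustraliaEast"
New-AzResourceGroupDeployment -ResourceGroupName "sftp_rg" -TemplateFile ".\storageaccount.bicep" -storageaccprefix "sftp"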

Setup SFTP

Now that you have a compatible Azure storage account, it is time to enable SFTP!
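
As an aside, if your storage account already exists (and has the hierarchical namespace enabled), recent versions of the Az.Storage module can toggle SFTP without the portal; a hedged sketch, with the resource group name as a placeholder:

# Enable SFTP on an existing HNS-enabled storage account
Set-AzStorageAccount -ResourceGroupName "storage_rg" -Name "sftpstorageacc1337" -EnableSftp $true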

  1. Log in to the Azure Portal
  2. Navigate to the Storage account you have created for SFTP and click on it
  3. On the Storage account blade, under Settings, you will see: SFTP
  4. Azure Portal - Enable SFTP
  5. Click on SFTP and click + Add local user.
  6. Type in the username of the user you would like to use (remember you can have up to 1000 local users, but there is currently no integration with Azure AD, Active Directory or other authentication services); in my example, I will use: lukeftpuser
  7. You can use SSH keys or passwords (or both); in this article, I am simply going to use a password, so I select: SSH Password (a PowerShell sketch of the whole local-user setup follows this list).
  8. Click Next
  9. Our storage account is empty, so we now need to create a top-level container; I will select Create new and set the name to: ftp
  10. I will leave the Public access level to Private (no anonymous access)
  11. Click Ok
  12. Now that the ftp container has been created, we need to set the permissions; I am simply going to grant Read, Create, Delete, List and Write. It's worth noting that if you only need to read or list contents, those are the only permissions you need. These permissions apply to the Container, not the folder, so your users may end up with permissions to other folders in the same Container if not managed appropriately.
  13. Now we set the Home directory. This is the directory that the user will be automatically mapped to; it is optional, but if you don't fill in a Home directory for the user, they will need to navigate to the appropriate folders manually when connecting to SFTP. The home directory needs to be relative, i.e.: ftp/files (the container name and the files folder, located in the ftp container).
  14. Azure Portal - Enable SFTP
  15. Because we specified Password earlier, Azure has automatically created a new password for that account. Although you can generate new passwords, you are unable to specify what the password is, so make sure you copy it and store it in a password vault of some kind; the password generated for me was 89 characters long.
  16. Azure Portal - Enable SFTP
  17. You should see the connection string of the user, along with the Authentication method and container permissions.
  18. Azure Storage Account SFTP - Local User Created
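
If you prefer to script the local user rather than use the portal clicks above, a rough sketch using the Az.Storage local-user cmdlets is below (availability and exact parameter names may vary between Az.Storage versions; the resource group name is a placeholder, while the account, user, container and home directory match this article):

# Define the permission scope for the 'ftp' container (r = read, w = write, d = delete, l = list, c = create)
$scope = New-AzStorageLocalUserPermissionScope -Permission rwdlc -Service blob -ResourceName ftp

# Create (or update) the local user with a home directory of ftp/files and password authentication
Set-AzStorageLocalUser -ResourceGroupName "storage_rg" -StorageAccountName "sftpstorageacc1337" -UserName lukeftpuser -HomeDirectory "ftp/files" -PermissionScope $scope -HasSshPassword $true

# Generate (and display) the SSH password for the user - store it somewhere safe
New-AzStorageLocalUserSshPassword -ResourceGroupName "storage_rg" -StorageAccountName "sftpstorageacc1337" -UserName lukeftpuser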

Test Connectivity via SFTP to an Azure Storage Account

I will test Connectivity to the SFTP Azure Storage account using Windows 11, although the same concepts apply across various operating systems (Linux, OSX, etc.).

Test using SFTP from Windows using command prompt

  1. Make sure you have a copy of the Connection String and user password from the SFTP user account created earlier.

  2. Open Command Prompt

  3. Type in sftp CONNECTIONSTRING and press Enter (see the example after this list):

  4. If you get a prompt to verify the authenticity of the host, check that the name/URL of the storage account matches and type in: Yes, to add the storage account to your known hosts list

  5. Press Enter and paste in the copy of the Password that was generated for you earlier.

  6. You should be connected to the Azure Storage account via SFTP!

  7. As you can see below, I am in the Files folder, which is my user's home folder, and there is a file named: Test in it.

  8. SFTP Windows

    Once you have connected to SFTP using the Windows command line, you can type in: ?

    That will give you a list of all the available commands you can run, e.g. to upload files.
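
For reference, the connection string follows the storageaccount.username@storageaccount.blob.core.windows.net pattern, so with the names used in this article the command looks like the below (substitute your own account and user names):

sftp sftpstorageacc1337.lukeftpuser@sftpstorageacc1337.blob.core.windows.net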

Test using WinSCP

  1. Make sure you have a copy of the Connection String and user password from the SFTP user account created earlier.
  2. If you haven't already, download WinSCP and install it
  3. You should be greeted by the Login page (but if you aren't, click on Session, New Session)
  4. For the hostname, type in the URL for the storage account (after the @ in the connection string)
  5. For the username, type in everything before the @
  6. Type in your Password
  7. Verify that the port is 22 and file protocol is SFTP and click Login
  8. Azure SFTP - WinSCP
  9. Azure SFTP - WinSCP

Congratulations! You have now created and tested Connectivity to the Azure Storage SFTP service!

Whitelisting your Public IP with Azure Bicep and PowerShell

· 4 min read

Allowing and restricting access to Azure resources from specific Public IP (Internet Protocol) addresses has been around for years; most Azure resources support it, and a Storage account is no different.

In this article, I will be using PowerShell to obtain my current public IP, then parse that variable into my Azure Bicep deployment to create a storage account, with the firewall rule allowing ONLY my public IP address.

I will assume that you have both Azure Bicep and PowerShell Azure modules installed and the know-how to connect to Microsoft Azure.

Utilising PowerShell to create dynamic variables in your deployment can open the doors to more flexible deployments, such as including the name of the person deploying the infrastructure into the tags of the resource - or in this case, adding a whitelisted IP automatically to your Azure resource to be secure by default.

I will be using PowerShell splatting as it's easier to edit and display. You can easily take the scripts here to make them your own.

Azure Bicep deployments (like ARM) support the 'TemplateParameterObject' parameter, which allows Azure Bicep to accept parameters from PowerShell directly; this can be pretty powerful when used with a self-service portal or pipeline.

Now we are ready to create the Azure Storage account...

I will first make an Azure Resource Group for my storage account using PowerShell, then use the New-AzResourceGroupDeployment cmdlet to deploy my storage account from my Bicep file.

#Connects to Azure
Connect-AzAccount
#Grabs the Public IP of the currently connected PC and adds it into a variable.
$publicip = (Invoke-WebRequest -uri "http://ifconfig.me/ip").Content
#Resource Group Name
$resourcegrpname = 'storage_rg'
#Creates a resource group for the storage account
New-AzResourceGroup -Name $resourcegrpname -Location "AustraliaEast"
# Parameters splat for the Azure Bicep template; this is where your Azure Bicep parameters go
$paramObject = @{
    'storageaccprefix'  = 'stg'
    'whitelistpublicip' = $publicip
}
# Parameters splat for the New-AzResourceGroupDeployment cmdlet
$parameters = @{
    'Name'                    = 'StorageAccountDeployBase'
    'ResourceGroupName'       = $resourcegrpname
    'TemplateFile'            = 'c:\temp\storageaccount.bicep'
    'TemplateParameterObject' = $paramObject
    'Verbose'                 = $true
}
#Deploys the Azure Bicep template
New-AzResourceGroupDeployment @parameters

Azure Bicep - Parameter

As you can see above, I am grabbing my current IP address from the ifconfig.me website and storing it in a variable (as a string), then referencing it in $paramObject, which is passed to the TemplateParameterObject parameter as parameter strings for Azure Bicep; my IP address (I am running this from an Azure VM) is then passed through to Azure Bicep.

My Azure Bicep is below:

param storageaccprefix string = ''
param whitelistpublicip string = ''
var location = resourceGroup().location

resource storageaccount 'Microsoft.Storage/storageAccounts@2021-06-01' = {
  name: '${storageaccprefix}${uniqueString(resourceGroup().id)}'
  location: location
  sku: {
    name: 'Standard_ZRS'
  }
  kind: 'StorageV2'
  properties: {
    defaultToOAuthAuthentication: false
    allowCrossTenantReplication: false
    minimumTlsVersion: 'TLS1_2'
    allowBlobPublicAccess: true
    allowSharedKeyAccess: true
    isHnsEnabled: true
    networkAcls: {
      resourceAccessRules: []
      bypass: 'AzureServices'
      virtualNetworkRules: []
      ipRules: [
        {
          value: whitelistpublicip
          action: 'Allow'
        }
      ]
      defaultAction: 'Deny'
    }
    supportsHttpsTrafficOnly: true
    encryption: {
      services: {
        blob: {
          keyType: 'Account'
          enabled: true
        }
      }
      keySource: 'Microsoft.Storage'
    }
    accessTier: 'Hot'
  }
}

In Azure Bicep, I am accepting the whitelistpublicip parameter from PowerShell and passing it into the ipRules array as an Allow rule, while the defaultAction is 'Deny'.
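
If you later need to add or verify allowed IPs without redeploying, you can use the standard Az.Storage network-rule cmdlets; a quick sketch (the resource group and storage account names are placeholders):

# Add another public IP to the storage account firewall
Add-AzStorageAccountNetworkRule -ResourceGroupName "storage_rg" -Name "stgexample" -IPAddressOrRange "203.0.113.10"

# List the current firewall rules and default action
Get-AzStorageAccountNetworkRuleSet -ResourceGroupName "storage_rg" -Name "stgexample"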

If I navigate to the Azure Portal, I can see my newly created storage account; under the Networking blade, I can see that the Firewall has been enabled and my Public IP has been added successfully:

Azure Storage Account - Network

Hopefully, this helps you be more secure from deployment time and gives you a good framework to work on; in the future, the same process can be used to create inbound RDP rules for Virtual Machines, as an example.

Capturing Virtual Machine images and Snapshots in Azure using WVDAdmin

· 7 min read

WVDAdmin is a native administration GUI (graphical user interface) for Azure Virtual Desktop (AVD). It is a free, custom-built tool designed to make managing and standing up Azure Virtual Desktop infrastructure easy. Not only can you use it to roll out your Azure Virtual Desktop infrastructure and manage existing workspaces and host pools, you can also use it to create Virtual Machine images for Virtual Machine Scale Sets, base builds, or Azure Virtual Desktop session hosts! In addition, WVDAdmin automates creating and using snapshots and virtual machine images in a simple point-and-click interface that just works!

Prerequisites

  • Azure subscription
  • Resource Group
  • Virtual Machine (to be used as your master image)
  • Of course - WVDAdmin

You can download WVDAdmin from the following page: Azure Windows Virtual Desktop administration with WVDAdmin.

Also, make sure you have set up a service principal with the appropriate rights to the Resource Group(s) that hold your Virtual Machine.

Before proceeding ahead, make sure you have a virtual machine backup!

Capturing a Snapshot

Although it is possible to do this using the Azure Portal, quickly taking an OS disk snapshot and then reverting the change can be a bit tedious, especially if you want to make a quick backup of the operating system disk before patching or an application upgrade. Snapshots are much quicker to take and work well for immediate, temporary recovery, especially when you want to quickly try something out without having to wait for an Azure Backup (a rough PowerShell equivalent is sketched below). Please note this tool does not snapshot any data drives present.
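
Under the hood, this is roughly equivalent to the standard managed-disk snapshot cmdlets; a minimal sketch (the resource group and VM names follow this article, the snapshot name is a placeholder, and WVDAdmin's exact process may differ):

# Grab the VM and its OS disk
$vm = Get-AzVM -ResourceGroupName "SERVERS-RG" -Name "Server2019"

# Build and create a snapshot of the OS disk
$snapshotConfig = New-AzSnapshotConfig -SourceUri $vm.StorageProfile.OsDisk.ManagedDisk.Id -Location $vm.Location -CreateOption Copy
New-AzSnapshot -ResourceGroupName "SERVERS-RG" -SnapshotName "Server2019-osdisk-snapshot" -Snapshot $snapshotConfig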

Capture a Snapshot

  1. Open WVDAdmin
  2. On the "Welcome" tab, enter in your Azure Tenant id
  3. Enter in your Service principal (application) ID and key
  4. Click on Reload all - to connect to Azure
  5. Expand Azure
  6. Expand Virtual Machines
  7. Expand your Resource group; in my example, it is: SERVERS-RG
  8. Right-click your server; in my example, it is: Server2019
  9. Select SnapShot-Create
  10. WVDAdmin - Create Snapshot
  11. WVDAdmin will then prompt you to verify that you want to create your Snapshot.
  12. WVD - Verify Snapshot
  13. Confirm the server is correct and click Ok
  14. Depending on the size of your disk, this process may take only a few seconds; the Virtual Machine may experience a slight performance hit, but I did not lose RDP connectivity during the snapshot process in my testing.
  15. Review the logs to make sure that the Snapshot has been created successfully:
  16. Snapshot
  17. You should now see the Snapshot in the Azure Portal, in the same Resource Group as the server.
  18. Azure Portal - Snapshot

Restore a Snapshot

Before you proceed, just a warning that restoring the Snapshot will discard any changes made after the Snapshot was taken. The virtual machine will also be deallocated, so any connections to it will be dropped. (A rough PowerShell equivalent of what a restore involves is sketched below.)
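
For context, restoring a snapshot in plain PowerShell roughly means creating a new managed disk from the snapshot and swapping it in as the OS disk; a minimal sketch (names are placeholders that follow the earlier examples, and WVDAdmin's exact process may differ):

# Stop and deallocate the VM before swapping disks
Stop-AzVM -ResourceGroupName "SERVERS-RG" -Name "Server2019" -Force

# Create a new managed disk from the snapshot
$snapshot = Get-AzSnapshot -ResourceGroupName "SERVERS-RG" -SnapshotName "Server2019-osdisk-snapshot"
$diskConfig = New-AzDiskConfig -SourceResourceId $snapshot.Id -Location $snapshot.Location -CreateOption Copy
$newDisk = New-AzDisk -ResourceGroupName "SERVERS-RG" -DiskName "Server2019-osdisk-restored" -Disk $diskConfig

# Swap the OS disk onto the VM and start it again
$vm = Get-AzVM -ResourceGroupName "SERVERS-RG" -Name "Server2019"
Set-AzVMOSDisk -VM $vm -ManagedDiskId $newDisk.Id -Name $newDisk.Name
Update-AzVM -ResourceGroupName "SERVERS-RG" -VM $vm
Start-AzVM -ResourceGroupName "SERVERS-RG" -Name "Server2019"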

  1. Open WVDAdmin
  2. On the "Welcome" tab, enter in your Azure Tenant id
  3. Enter in your Service principal (application) ID and key
  4. Click on Reload all - to connect to Azure
  5. Expand Azure
  6. Expand Virtual Machines
  7. Expand your Resource group; in my example, it is: SERVERS-RG
  8. Right-click your server; in my example, it is: Server2019
  9. Select SnapShot-Restore
  10. Azure Disk Snapshot
  11. Select the Snapshot you would like to restore to, and when you are ready, click Ok. This will force the Virtual Machine to be shut down and deallocated and the Snapshot to be restored.
  12. Azure Disk Snapshot
  13. You may also start the VM from WVDAdmin, by right-clicking on the Virtual Server after the Snapshot restores and clicking: Start.
  14. Azure Disk Snapshot
  15. Verify that your Virtual Machine is back up and running and remove any unneeded snapshots and disks from the Azure Portal, to reduce additional costs. If you intend to keep any around, make sure you add appropriate Tags and a review date so you know what and why they existed in the first place.

A few things to note:

  • WVDAdmin gave me errors stating that the "Recovering snapshot was not successful"; however, this occurred after the disk-swapping process, when the old disk was being deleted. The recovery did, in fact, complete; I then successfully deleted the disks in the Azure Portal manually.
  • I also saw the "Virtual machine agent status is not ready." error. This self-resolved once the Virtual Machine had enough time to start the Azure agent.

Capturing a Virtual Machine Image

Virtual Machine images work well for Azure Virtual Desktop and Virtual Machines scale sets, where you want consistency between your various virtual machines. The same process I will run through works with Windows 10/11 along with Windows Server 2022 and below (and I would also imagine Linux workloads).

I will be using the Windows Server 2019 Virtual Machine I created earlier, but with various applications that I want to be standard across new builds; in my demo, I used Chocolatey to install:

  • Adobe Reader
  • Microsoft Visual C++ runtimes
  • 7Zip
  • VLC

I then added a custom user policy to set the wallpaper. WVDAdmin will automatically generalise (sysprep) the machine for you by creating a 'Temp' machine, without touching your original Virtual Machine!
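
If you want to script that application install as well, a quick Chocolatey sketch is below (the package IDs are my best guess at the community packages for the applications listed above, so verify them against the Chocolatey gallery before relying on them):

# Install Chocolatey (run from an elevated PowerShell prompt)
Set-ExecutionPolicy Bypass -Scope Process -Force
[System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072
Invoke-Expression ((New-Object System.Net.WebClient).DownloadString('https://community.chocolatey.org/install.ps1'))

# Install the applications used in this demo
choco install adobereader vcredist140 7zip vlc -y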

Capture a Virtual Machine Image

  1. Open WVDAdmin
  2. On the "Welcome" tab, enter in your Azure Tenant id
  3. Enter in your Service principal (application) ID and key
  4. Click on Reload all - to connect to Azure
  5. Expand Azure
  6. Expand Virtual Machines
  7. Expand your Resource group; in my example, it is: SERVERS-RG
  8. Right-click your server; in my example, it is: Server2019
  9. Select Create a template image
  10. WVDAdmin - Create a template image
  11. WVDAdmin will then display the: Capture Image tab.
  12. Type in an appropriate image name (make sure you understand it, add specific versioning etc.)
  13. Verify that your Template VM is correct
  14. Select your Target Resource Group for your Image
  15. If you have a custom PowerShell script, you may add additional customisations. Add the script path here (make sure it's publicly accessible by Azure, e.g. an Azure storage account, GitHub repository, etc.).
  16. Before proceeding to the next step, your VM will be deallocated
  17. When you are ready, select Capture
  18. WVDAdmin will then deallocate your VM and run through the following process:
  19. Deallocate VM -> Create a snapshot of the VM -> Create a temporary VM from the snapshot -> Generalise the VM -> Deallocate the temporary VM -> Create the image -> Delete the temp VM resources (a rough PowerShell sketch of the image-creation step follows this list)
  20. WVDAdmin - Capture Image
  21. You should now see your Image in your Azure Portal.
  22. Azure - Custom Image
  23. You can now create additional Virtual Machines from your Custom image using the Azure Portal.
  24. WVDAdmin can also copy your Custom Image into a Shared Image Gallery, or you can use it to create an Azure Virtual Desktop session host!
  25. WVDAdmin - New Session Host
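
For reference, the final image-creation step that WVDAdmin performs against the temporary, generalised VM looks roughly like the below in plain PowerShell (a sketch only; the temporary VM and image names are placeholders, and WVDAdmin's exact process may differ):

# Mark the (already sysprepped and deallocated) temporary VM as generalised
Stop-AzVM -ResourceGroupName "SERVERS-RG" -Name "Server2019-temp" -Force
Set-AzVM -ResourceGroupName "SERVERS-RG" -Name "Server2019-temp" -Generalized

# Create a managed image from the generalised VM
$tempVm = Get-AzVM -ResourceGroupName "SERVERS-RG" -Name "Server2019-temp"
$imageConfig = New-AzImageConfig -Location $tempVm.Location -SourceVirtualMachineId $tempVm.Id
New-AzImage -ResourceGroupName "SERVERS-RG" -ImageName "Server2019-image-v1" -Image $imageConfig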

Hopefully, this article has been of some use - even if you don't use Azure Virtual Desktop - WVDAdmin is a great tool to help with day-to-day Azure Virtual Machine operations.

Cloud Adoption Framework for Azure - Tools and Templates

· 3 min read

To help with your Microsoft Cloud Adoption and Azure migration, you need a few things to be successful:

  1. Define your strategy: what are your expected outcomes? Where do you start? What skills do you have or need?
  2. Plan: this may include organisational alignment to get moving to the Cloud
  3. Ready: this is where you look at your governance, Landing Zones/Blueprints
  4. Adopt: this is where you actually migrate your workloads into the cloud, both existing apps and new

Cloud Adoption Framework for Azure

Here are some useful tools, templates, and assessments provided by Microsoft to help on your journey:

Note: It's not that you can't get these resources elsewhere; I purely wanted a list format for easy reference.

Define strategy

Plan

Ready

Adopt

Govern

Manage

The Microsoft Cloud Adoption Framework page has everything listed above and more! If you are serious about Cloud Adoption, reading through the official documentation not only gives you better context for the resources linked on this page but also gives you more ways to think about potential opportunities to help with your Cloud adoption!

Most of these tools can be found directly in the public Cloud Adoption Framework GitHub repository: microsoft/CloudAdoptionFramework, so keep an eye on that!

Make sure you follow the Cloud Adoption Framework - What's new page to keep up with the current best practices.