Azure Bicep and Insert Resource

· 3 min read

Azure Bicep is a Domain Specific Language (DSL) for deploying Azure resources declaratively. Azure Bicep is a transparent abstraction over ARM and ARM templates, which means anything that can be done in an ARM Template can be done in Bicep.

Azure Bicep was recently (December 2021) updated to v0.4.1124, which, along with various other hotfixes and enhancements, adds 'Insert Resource' functionality.

Insert Resource simplifies ARM-to-Bicep conversion when you only need a single resource: instead of exporting an entire ARM template and then decompiling it to Bicep, you can insert the resource you are after directly into your Bicep file.

To use Insert Resource, you will need to have:

  • Bicep version v0.4.1124 or later
  • Azure CLI
  • Visual Studio Code with the Bicep extension

You can easily install or upgrade both by following the Microsoft documentation on the Install Bicep tools page. You can also review the Bicep changes and latest release notes on GitHub: Azure Bicep releases

Import Resources into Bicep using Azure CLI and Bicep

  1. Open a new file in Visual Studio Code
  2. Set the Language mode to Bicep
     Visual Studio Code - Bicep
  3. Now we need to log in to Azure; in Visual Studio Code, click View, then Terminal.
  4. In the terminal, type in: az login
  5. Log in to Azure using credentials that have read access to the resource you want to export.
  6. Once you are logged in, type in: az resource list (see the example after this list for a way to narrow down the output)
  7. In the JSON output in the terminal, copy the resource ID (inside the double quotes of the id value)
  8. Now we need to open the Command Palette; press CTRL+Shift+P on your keyboard (or click on View, Command Palette)
  9. Start typing in Bicep; if you have the latest version, you should see: Bicep: Insert Resource - select this
     Azure Bicep - Insert Resource
  10. Enter the resource ID you copied earlier.
     Azure Bicep - Insert Resource
  11. Azure Bicep should now have connected and exported your resource straight into Bicep! As below, it imported a Log Analytics workspace in my subscription straight into Bicep.
     Azure Bicep - Insert Resource
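
If your subscription has a lot of resources, az resource list on its own can be noisy. Here is a minimal sketch of narrowing down the output from the Visual Studio Code terminal (the resource group and workspace names are placeholders); the commands work the same whether the terminal is PowerShell or another shell:

# List just the name and id of each resource in a resource group
az resource list --resource-group my-resource-group --query "[].{name:name, id:id}" --output table

# Or fetch the id of a known resource directly
az resource show --resource-group my-resource-group --name my-workspace --resource-type "Microsoft.OperationalInsights/workspaces" --query id --output tsv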

Finding the resource ID using the Azure Portal

You can use the Azure CLI to find the resource ID, but you can also use the Azure Portal:

  1. Log in to the Azure Portal
  2. Navigate to the Resource you want to export to Bicep
  3. On the Overview pane, click on JSON view

I had problems exporting an App Service and App Service plan, so for some resources with multiple dependencies you may be better off exporting the ARM template from the resource (or resource group) and decompiling it instead (see the example below), but Insert Resource is a very quick way to bring your resources into Bicep!
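
For reference, a minimal sketch of that export-and-decompile route using the Azure CLI (the resource group and file names are placeholders); the decompiled Bicep usually needs some tidying afterwards:

# Export the whole resource group as an ARM template, then decompile it to Bicep
az group export --name my-resource-group > exported-template.json
az bicep decompile --file exported-template.json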

Day in the Life of a Technical Lead

· 4 min read

Being a 'Technical Lead' or 'Tech Lead' means different things to different people and organizations; based on definitions found online, a Technical Lead is:

"A technical lead is a professional who oversees a team of technical personnel at a software or technology company. They often lead software development or software engineering teams and troubleshoot technical issues that involve software development, engineering tasks and product releases."

Although I agree with this, I would flesh it out a bit more around architectural governance (or technical assurance, which is the problem this role or function exists to solve). It also doesn't need to be software-development heavy; it can sit in the operational and delivery spaces as well (waterfall or agile), and it is more than a specific role: it is a frame of mind.

Tech Lead - Venn diagram

At a very high level, this is what a day in the life of a technical lead means to me:

Day in the Life of a Tech Lead

  • Work alongside: Technical Product Owners, Chapter Members, Architecture, Business stakeholders and Service Partners to develop/roadmap/architect and improve technology.
  • Manage delivery and operational risks and dependencies, and remove impediments to the achievement of the team objectives
  • Test and develop roadmaps for preview Cloud capabilities for immediate or future value
  • Act as a Subject Matter Expert (or Consultant) to assist in Design Decisions, Monitoring, Cost and Capacity Requirements
  • Develop Governance processes for onboarding services into BAU, enabling Technology Infrastructure and Operations staff to use technology in a consistent and secure manner
  • Work alongside Security and Developers to enable cross-team visibility and collaboration
  • Champion improvements in People/Processes and ways of working
  • Work alongside Chapter Members and Chapter Lead to develop Training/Skill programs for Technical areas
  • Develop and promote an 'everything as code', 'everything is automated' mindset
  • Problem/Incident Management (i.e. Continuous improvement)

A Technical Lead mindset may look like the below:

  • Automate what's trivial, boring, mundane, and belittling
  • Build what you can't buy. Buy what you can't live without
  • Make your work visible. Shift your value to performance.
  • Work is never completed. Establish feedback loops.
  • Target high impact problems.
  • Get out of the way of the work, think outside of the box, don't limit others.
  • Try, Learn, Adapt, Try again
  • Agile is about speed to adapt, not velocity
  • Log what's useful, monitor what matters, alert on what's actionable
  • Empower others while making sure that everything is auditable, standardised.
  • We live in a VUCA (Volatility, Uncertainty, Complexity, Ambiguity) world; you will never see perfect.

The views above are my own, but a shout out to Teal Unicorn for independent consulting on Ways of Working and continuous improvement; I attended a few of their workshops on Ways of Working, Ways of Managing and Ways of Consulting, and they helped me take a step back and look at what this kind of mindset may (or should) look like, and at the current blockers.

Overall, I have noticed that Information Technology roles are now blending disciplines that once required specific job roles (i.e. Business Analyst, Service Delivery Manager, Developer, Architect). Although pure technical roles still exist with Cloud technologies, different skillsets are required to get the most value out of technology stacks as technology becomes more consumable. You may also be interested in reading my thoughts on: The Cloud Frame of Mind

Hopefully this has helped, or at least encouraged any readers out there to look at problems differently, or at areas of improvement!

Azure Storage Account SFTP errors

· 2 min read

As part of standing up and using an Azure Storage account as an SFTP service, I ran into a few issues. This blog post is merely intended to show my findings in case others run into similar issues.

PTY allocation request failed on channel 0

Even though you appear to have connected successfully, you may see the following errors:

  • PTY allocation request failed on channel 0
  • shell request failed on channel 0

You may laugh, but the solution for this was very simple: switch from SSH to SFTP!

If you were like me, you just flicked to SSH out of habit.

Home Directory is not accessible

Make sure that the Home directory (folder) is created in your container; SFTP won't create this for you.

Also, make sure that the Home directory for the user references Container/Folder, like the below:

Azure Portal - Enable SFTP

Wrong username, authentication failed

When attempting to connect to SFTP using a tool such as WinSCP, I got:

  • Using username "lukeftpuser".
  • Authentication failed.

The username is actually made up of:

STORAGEACCOUNTNAME.USERNAME (the storage account name, a period, then the local user name), i.e.: sftpstorageacc1337.lukeftpuser

WinSCP Connection Azure SFTP
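
Putting that together, a quick sketch of the full connection from a terminal (the storage account and user names are the examples from this post; the host is the storage account's blob endpoint):

# Username format is STORAGEACCOUNTNAME.USERNAME, both in WinSCP and on the command line
sftp sftpstorageacc1337.lukeftpuser@sftpstorageacc1337.blob.core.windows.net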

Unable to find Azure Storage SSH Keys

This is not an error, but Azure Key Vault does not currently support SSH keypairs, so once they are created by Azure, they are stored in a Microsoft.Compute.sshPublicKeys resource found here: SSH Keys

SFTP in Microsoft Azure using Azure Blob Storage

· 11 min read

SSH File Transfer Protocol (SFTP) support is now available in Preview for Azure Blob Storage accounts with hierarchical namespace enabled.

Although tools such as Storage Explorer, Data Factory and AzCopy allow copying to and from Azure storage accounts, sometimes your applications need more traditional integration, so SFTP is a welcome addition to the Microsoft Azure ecosystem, which in some cases removes the need for additional Virtual Machine(s).

This support enables standard SFTP connectivity to an Azure Storage account. As an Azure PaaS (Platform as a Service) resource, it offers additional flexibility, reduces operational overhead, and increases redundancy and scalability.

We will run through the initial setup of the Azure Storage account using the Azure Portal.

SFTP using an Azure Storage account does not support shared access signature (SAS) or Microsoft Entra ID (Azure AD) authentication for connecting SFTP clients. Instead, SFTP clients must use a password or a Secure Shell (SSH) private/public keypair.

Before we head into the implementation, just a bit of housekeeping: this is currently still in Preview at the time this post was written; the functionality MAY change by the time it becomes GA (Generally Available).

During the public preview, the use of SFTP does not incur any additional charges. However, the standard transaction, storage, and networking prices for the underlying Azure Data Lake Storage Gen2 account still apply. SFTP might incur additional charges when the feature becomes generally available. At the time of the preview, SFTP support is only available in certain regions.

You can connect to the SFTP storage account using an SSH public/private keypair that is local to the SFTP storage account, a password, or both. You can also set up individual HOME directories for each user (because of the hierarchical namespace, these are folders, not containers), with a maximum of 1000 local user accounts.

SFTP Azure Storage Account - High Level Diagram

Creating an Azure Storage account for SFTP

This article assumes you have an Azure subscription and rights to create a new Storage account resource; however, if you have an existing storage account, the following prerequisites apply:

  • A standard general-purpose v2 or premium block blob storage account. You can also enable SFTP as you create the account.
  • The account redundancy option of the storage account is set to either locally-redundant storage (LRS) or zone-redundant storage (ZRS); GRS is not supported.
  • The hierarchical namespace feature of the account must be enabled for existing storage accounts. To enable the hierarchical namespace feature, see Upgrade Azure Blob Storage with Azure Data Lake Storage Gen2 capabilities.
  • If you're connecting from an on-premises network, make sure that your client allows outgoing communication through port 22; SFTP uses that port (a quick check is sketched below).
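
If you want to confirm port 22 is reachable from your client before troubleshooting anything else, here is a quick PowerShell check against the storage account's blob endpoint (the account name is the example used later in this post):

# Returns TcpTestSucceeded = True if outbound port 22 is open from this machine
Test-NetConnection -ComputerName sftpstorageacc1337.blob.core.windows.net -Port 22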

Fill out the SFTP Public Preview Interest Form

Because the SFTP functionality is currently in Preview, Microsoft has asked that anyone interested in the SFTP Preview fill out a Microsoft Forms survey.

This MAY be required before proceeding to the following steps; initially, I believe this was required, but there appear to have been a few people I know who have registered the feature without the form. Either way, the SFTP Public Preview Interest form is a good opportunity to supply your use-case information to Microsoft directly, to help improve the service going forward.

Registering the Feature

To create an Azure Storage account that supports SFTP - we need to enable the Preview Feature.

  1. Log in to the Azure Portal

  2. Navigate to: Subscriptions

  3. Select the Subscription that you want to enable the SFTP preview for

  4. Click on: Preview features

  5. Search for: SFTP

  6. Click on: SFTP support for Azure Blob Storage and click Register - this may take from minutes to a few days, as each preview request may need to be manually approved by Microsoft personnel based on the Public Preview Interest form. My feature registration occurred quite quickly, so there is a chance that they have either automated the approvals or I was just lucky.

    As you can see in the screenshot below, I had already registered mine:

    Azure Portal SFTP Preview Feature

  7. You can continue to hit refresh until it changes from: Registering to Registered.

  8. While we are here, let's check that the Microsoft.Storage resource provider is registered (it should already be enabled, but it is a good opportunity to check before attempting to create a resource and getting a surprise). Click on Resource providers in the left-hand menu and search for: Storage; if it is set to NotRegistered, click on Microsoft.Storage and click Register.

To register the SFTP feature using PowerShell, you can run the following cmdlet:

Register-AzProviderFeature -FeatureName "AllowSFTP" -ProviderNamespace "Microsoft.Storage"
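
If you prefer to check the registration state from PowerShell as well, here is a small sketch using the standard Az cmdlets:

# Check the state of the SFTP preview feature (shows Registering/Registered)
Get-AzProviderFeature -FeatureName "AllowSFTP" -ProviderNamespace "Microsoft.Storage"

# Confirm the Microsoft.Storage resource provider is registered, and register it if needed
Get-AzResourceProvider -ProviderNamespace "Microsoft.Storage" | Select-Object ProviderNamespace, RegistrationState
Register-AzResourceProvider -ProviderNamespace "Microsoft.Storage"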

Create the Azure Storage Account

Now that the Preview feature has been registered, we can now create a new Storage account.

  1. Log in to the Azure Portal
  2. Click on +Create a resource
  3. Type in: Storage account, click on the Microsoft Storage account resource and click Create
     Azure Portal - Storage account
  4. Select the Subscription you enabled the SFTP feature in earlier
  5. Select your Resource Group (or create a new resource group) to place your storage account into.
  6. Enter your storage account name (this needs to be globally unique and a maximum of 24 characters); in my example, I am going with: sftpstorageacc1337
  7. Select your Region; remember that only specific regions have SFTP support at the time of this article.
  8. Select your performance tier; Premium is supported (remember to select block blobs), but I will select Standard.
  9. Select your Redundancy; remember that geo-redundant options (GRS, RA-GRS) aren't supported at this time. I will select Zone-redundant storage (ZRS) so that my storage account is replicated between the three availability zones, but you can also select LRS (Locally Redundant Storage).
     Azure Portal - Create v2 Storage Account
  10. Click Next: Advanced
  11. Leave the Security options as-is and check: Enable hierarchical namespace under the Data Lake Storage Gen2 subheading.
  12. Click Enable SFTP
     Azure Portal - Enable SFTP
  13. Click: Next: Networking
  14. SFTP supports Private Endpoints (as a blob storage sub-resource), but in this case, I will be keeping Connectivity as a Public endpoint (all networks)
     Azure Portal - Enable SFTP
  15. Click Next: Data Protection
  16. Here you can enable soft delete for your blobs and containers, so if a file is deleted it is retained for a period (the default is 7 days) before being permanently deleted; I am going to leave mine set to the default and click: Next: Tags.
  17. Add in any applicable Tags (e.g. who created it, when you created it, what you created it for) and click Review + Create
  18. Review your configuration, make sure that Enable SFTP is enabled along with Hierarchical namespace, and click Create.
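
If you would rather script the account creation than click through the portal, here is a rough PowerShell sketch of the choices above (the resource group name and location are placeholders; the account name matches this walkthrough; enabling SFTP itself is done in the portal in a later step):

# Create a StorageV2 account with ZRS redundancy and hierarchical namespace enabled
New-AzStorageAccount -ResourceGroupName "sftp-rg" `
  -Name "sftpstorageacc1337" `
  -Location "AustraliaEast" `
  -SkuName "Standard_ZRS" `
  -Kind "StorageV2" `
  -AccessTier Hot `
  -EnableHierarchicalNamespace $true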

In case you are interested in Infrastructure as Code, here is an Azure Bicep file I created to build a storage account ready for SFTP; it can be deployed to a Resource Group, ready for the next steps:

storageaccount.bicep
param storageaccprefix string = ''
var location = resourceGroup().location

resource storageacc 'Microsoft.Storage/storageAccounts@2021-06-01' = {
  name: '${storageaccprefix}${uniqueString(resourceGroup().id)}'
  location: location
  sku: {
    name: 'Standard_ZRS'
  }
  kind: 'StorageV2'
  properties: {
    defaultToOAuthAuthentication: false
    allowCrossTenantReplication: false
    minimumTlsVersion: 'TLS1_2'
    allowBlobPublicAccess: true
    allowSharedKeyAccess: true
    isHnsEnabled: true
    supportsHttpsTrafficOnly: true
    encryption: {
      services: {
        blob: {
          keyType: 'Account'
          enabled: true
        }
      }
      keySource: 'Microsoft.Storage'
    }
    accessTier: 'Hot'
  }
}
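
If you want to deploy that Bicep file from PowerShell, here is a minimal sketch (assuming the file is saved locally as storageaccount.bicep and the resource group already exists; the resource group name and prefix are placeholders):

# Deploy the Bicep template into an existing resource group, passing the prefix parameter
New-AzResourceGroupDeployment -Name "SftpStorageAccount" `
  -ResourceGroupName "sftp-rg" `
  -TemplateFile ".\storageaccount.bicep" `
  -storageaccprefix "sftp" `
  -Verbose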

Setup SFTP

Now that you have a compatible Azure storage account, it is time to enable SFTP!

  1. Log in to the Azure Portal
  2. Navigate to the Storage account you have created for SFTP and click on it
  3. On the Storage account blade, under Settings, you will see: SFTP
     Azure Portal - Enable SFTP
  4. Click on SFTP and click + Add local user.
  5. Type in the username of the user you would like to use (remember you can have up to 1000 local users, but there is currently no integration with Azure AD, Active Directory or other authentication services); in my example, I will use: lukeftpuser
  6. You can use SSH keys, passwords, or both; in this article, I am simply going to use a password, so I select: SSH Password.
  7. Click Next
  8. Our storage account is empty, so we now need to create a top-level container; I will select Create new and set the name to: ftp
  9. I will leave the Public access level as Private (no anonymous access)
  10. Click Ok
  11. Now that the ftp container has been created, we need to set the permissions; I am simply going to give the permissions of Read, Create, Delete, List and Write. It's worth noting that if you only need to read or list contents, then those are the only permissions you need. These permissions apply to the Container, not the folder, so your users may end up with permissions to other folders in the same Container if this is not managed appropriately.
  12. Now we set the Home directory. This is the directory that the user will be automatically mapped to; it is optional, but if you don't fill in a Home directory for the user, they will need to navigate to the appropriate folders manually when connecting to SFTP. The home directory needs to be relative, i.e.: ftp/files (the container name and the files folder, located in the ftp container).
     Azure Portal - Enable SFTP
  13. Because we specified Password earlier, Azure has automatically created a new password for that account. Although you can generate new passwords, you are unable to specify what the password is, so make sure you copy it and store it in a password vault of some kind; the password generated for me was 89 characters long.
     Azure Portal - Enable SFTP
  14. You should see the connection string of the user, along with the Authentication method and container permissions.
     Azure Storage Account SFTP - Local User Created
Test Connectivity via SFTP to an Azure Storage Account

I will test Connectivity to the SFTP Azure Storage account using Windows 11, although the same concepts apply across various operating systems (Linux, OSX, etc.).

Test using SFTP from Windows using command prompt

  1. Make sure you have a copy of the Connection String and user password from the SFTP user account created earlier.

  2. Open Command Prompt

  3. Type in sftp CONNECTIONSTRING (as in the example below) and press Enter:

  4. If you get a prompt to verify the authenticity of the host, check that it matches (i.e. the name/URL of the storage account) and type in: yes, to add the storage account to your known hosts list

  5. Press Enter and paste in the copy of the password that was generated for you earlier.

  6. You should be connected to the Azure Storage account via SFTP!

  7. As you can see below, I am in the Files folder, which is my user's home folder, and there is a file named: Test in it.

     SFTP Windows

    Once you have connected to SFTP using the Windows command line, you can type in: ?

    That will give you a list of all the available commands you can run, e.g. to upload or download files (a few common ones are sketched below).
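
For example, a few of the standard sftp client commands you are likely to use once connected (the file and folder names are placeholders, apart from the Test file mentioned above):

# Inside the sftp session
ls                      # list files in the current (home) directory
cd ftp/files            # change to another folder you have permissions to
put C:\temp\upload.txt  # upload a local file
get Test                # download a file from the storage account
exit                    # close the session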

Test using WinSCP

  1. Make sure you have a copy of the Connection String and user password from the SFTP user account created earlier.
  2. If you haven't already, download WinSCP and install it
  3. You should be greeted by the Login page (but if you aren't, click on Session, New Session)
  4. For the hostname, type in the URL for the storage account (after the @ in the connection string)
  5. For the username, type in everything before the @
  6. Type in your Password
  7. Verify that the port is 22 and file protocol is SFTP and click Login
  8. Azure SFTP - WinSCP
  9. Azure SFTP - WinSCP

Congratulations! You have now created and tested Connectivity to the Azure Storage SFTP service!

Whitelisting your Public IP with Azure Bicep and PowerShell

· 4 min read

Allowing or restricting access to Azure resources based on specific public IP (Internet Protocol) addresses has been around for years; most Azure resources support it, and a Storage account is no different.

In this article, I will be using PowerShell to obtain my current public IP, then pass that variable into my Azure Bicep deployment to create a storage account with a firewall rule allowing ONLY my public IP address.

I will assume that you have both Azure Bicep and PowerShell Azure modules installed and the know-how to connect to Microsoft Azure.

Utilising PowerShell to create dynamic variables in your deployment can open the doors to more flexible deployments, such as including the name of the person deploying the infrastructure into the tags of the resource - or in this case, adding a whitelisted IP automatically to your Azure resource to be secure by default.
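
As a quick illustration of that first idea (tagging resources with the deployer's name), here is a hedged sketch; the 'deployedBy' Bicep parameter is hypothetical and would need to exist in your template:

# Grab the signed-in account name and pass it to a (hypothetical) 'deployedBy' template parameter
$deployer = (Get-AzContext).Account.Id
$paramObject = @{
  'deployedBy' = $deployer
}
New-AzResourceGroupDeployment -Name "TaggedDeployment" `
  -ResourceGroupName "storage_rg" `
  -TemplateFile "c:\temp\storageaccount.bicep" `
  -TemplateParameterObject $paramObject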

I will be using PowerShell splatting as it's easier to edit and display. You can easily take the scripts here to make them your own.

Azure Bicep deployments (like ARM) support the following parameter on the New-AzResourceGroupDeployment cmdlet: 'TemplateParameterObject'. 'TemplateParameterObject' allows Azure Bicep to accept parameters from PowerShell directly, which can be pretty powerful when used with a self-service portal or pipeline.

Now we are ready to create the Azure Storage account...

I will first make an Azure Resource Group using PowerShell for my storage account first, then use the New-AzResourceGroupDeployment cmdlet to deploy my storage account from my bicep file.

#Connects to Azure
Connect-AzAccount
#Grabs the public IP of the currently connected PC and stores it in a variable
$publicip = (Invoke-WebRequest -uri "http://ifconfig.me/ip").Content
#Resource group name
$resourcegrpname = 'storage_rg'
#Creates a resource group for the storage account
New-AzResourceGroup -Name $resourcegrpname -Location "AustraliaEast"
# Parameters splat for Azure Bicep
# Parameter options for the Azure Bicep template - this is where your Azure Bicep parameters go
$paramObject = @{
  'storageaccprefix'  = 'stg'
  'whitelistpublicip' = $publicip
}
# Parameters that the New-AzResourceGroupDeployment cmdlet will be called with
$parameters = @{
  'Name'                    = 'StorageAccountDeployBase'
  'ResourceGroupName'       = $resourcegrpname
  'TemplateFile'            = 'c:\temp\storageaccount.bicep'
  'TemplateParameterObject' = $paramObject
  'Verbose'                 = $true
}
#Deploys the Azure Bicep template
New-AzResourceGroupDeployment @parameters

Azure Bicep - Parameter

As you can see above, I am grabbing my current IP address from the ifconfig website and storing it in a variable (as a string object), then referencing it in the paramObject, which is passed to the TemplateParameterObject parameter as parameter strings for Azure Bicep; my IP address (I am running this from an Azure VM) is then passed through to Azure Bicep.

My Azure Bicep is below:

param storageaccprefix string = ''
param whitelistpublicip string = ''
var location = resourceGroup().location

resource storageaccount 'Microsoft.Storage/storageAccounts@2021-06-01' = {
  name: '${storageaccprefix}${uniqueString(resourceGroup().id)}'
  location: location
  sku: {
    name: 'Standard_ZRS'
  }
  kind: 'StorageV2'
  properties: {
    defaultToOAuthAuthentication: false
    allowCrossTenantReplication: false
    minimumTlsVersion: 'TLS1_2'
    allowBlobPublicAccess: true
    allowSharedKeyAccess: true
    isHnsEnabled: true
    networkAcls: {
      resourceAccessRules: []
      bypass: 'AzureServices'
      virtualNetworkRules: []
      ipRules: [
        {
          value: whitelistpublicip
          action: 'Allow'
        }
      ]
      defaultAction: 'Deny'
    }
    supportsHttpsTrafficOnly: true
    encryption: {
      services: {
        blob: {
          keyType: 'Account'
          enabled: true
        }
      }
      keySource: 'Microsoft.Storage'
    }
    accessTier: 'Hot'
  }
}

In Azure Bicep, I am accepting the whitelistpublicip parameter from PowerShell and passing it into the ipRules array of networkAcls as an Allow rule, while the defaultAction is 'Deny'.

If I navigate to the Azure Portal, I can see my newly created storage account; under the Networking blade, I can see that the Firewall has been enabled and my Public IP has been added successfully:

Azure Storage Account - Network

Hopefully, this helps you be more secure from deployment time and gives you a good framework to build on; in the future, the same process could be used to create inbound RDP rules for Virtual Machines, for example (a rough sketch is below).
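
To illustrate that last idea, here is a hedged sketch of adding an inbound RDP rule for your current public IP to an existing Network Security Group with PowerShell (the NSG name and resource group are placeholders, and $publicip is the variable captured earlier):

# Add an inbound allow rule for RDP (TCP 3389) from the whitelisted public IP to an existing NSG
$nsg = Get-AzNetworkSecurityGroup -Name "my-vm-nsg" -ResourceGroupName "my-vm-rg"
$nsg | Add-AzNetworkSecurityRuleConfig -Name "Allow-RDP-MyPublicIP" `
  -Access Allow `
  -Protocol Tcp `
  -Direction Inbound `
  -Priority 300 `
  -SourceAddressPrefix $publicip `
  -SourcePortRange * `
  -DestinationAddressPrefix * `
  -DestinationPortRange 3389 | Set-AzNetworkSecurityGroup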