Deploy Azure Bicep (ARM Templates) to Multiple Environments using Azure DevOps Pipelines

This post looks at deploying resources defined using Azure Bicep to multiple environments using Azure DevOps Pipelines.

Previously I’ve posted on developing Azure Bicep code using Visual Studio Code and on how to use an Azure DevOps Pipeline to deploy bicep code to Azure. In this post we’re going to go one step further and look at deploying resources defined using Azure Bicep to multiple environments. Our goal is to fully automate this process, so we’re going to leverage Azure DevOps Pipelines to define a build and release process.

Let’s summarise the goals:

  • Azure resources to be defined using Azure Bicep
  • Build and Release process to be defined in yaml in Azure DevOps Pipelines
  • Build process to be triggered whenever the bicep code changes
  • Build process should output the compiled ARM template as an artifact
  • Release process needs to support multiple environments
  • Each environment should use a different resource group
  • Release process should gate deployment to particular environments using a mix of branch checks and approval gates

You might think these goals are too hard to achieve using a single build and release process but the good news is that most of the heavy lifting is already done by Azure DevOps.

Here’s a quick summary of what we need to set up:

  1. Azure Bicep file – this will define the resource(s) that we’re going to deploy to each environment
  2. Azure DevOps Environments – this will define the different environments we’re going to deploy to. At this stage this is limited to defining the approvals and checks that will be done before code/resources are deployed to an environment
  3. Azure DevOps Variable Groups – this will define variables that are used across all build and deployment steps, as well as variables that are specific to individual environments
  4. Azure DevOps Pipeline – this will define the actual build and release process for the bicep code.

Defining Resources Using Azure Bicep

Let’s start with a very simple Azure Bicep file (services.bicep) that defines a storage account.

/* 
******************  
    Literals
****************** 
*/

// Resource type prefixes
var storageAccountPrefix = 'st'

/* 
******************  
    Parameters
****************** 
*/

// ** General
param applicationName string 
param location string 
param env string 

/* 
******************  
    Resources
****************** 
*/

var appNameEnvLocationSuffix  = '${applicationName}${env}'

// Storage Account

var storageAccountName  = '${storageAccountPrefix}${appNameEnvLocationSuffix}' 

resource attachmentStorage 'Microsoft.Storage/storageAccounts@2019-06-01' = {
    name: storageAccountName
    location: location
    sku: {
        name: 'Standard_LRS'
        tier: 'Standard'
    }
    kind: 'StorageV2'
    properties: {
        accessTier: 'Hot'
        minimumTlsVersion: 'TLS1_2'
        supportsHttpsTrafficOnly: true
        allowBlobPublicAccess: true
        networkAcls: {
            bypass: 'AzureServices'
            defaultAction: 'Allow'
            ipRules: []
        }
    }
}

A few things to note about this bicep file:

  • It defines a constant, storageAccountPrefix, that is used to define the full name of the generated resource. This prefix comes from the list of recommended resource-type prefixes in the Microsoft documentation
  • There are three parameters defined: applicationName, location and env. Location is used to define the region the resource will be created in (alternatively you could use resourceGroup().location to use the same location as the resource group where the resource is being created). The applicationName and env parameters are combined with the storageAccountPrefix to define a unique name for the resource being created.
  • All three parameters are required, meaning they will need to be supplied when deploying the generated ARM template. To simplify the process you may want to define default values for applicationName and location. It’s important that you supply the env value during deployment to ensure the resource names generated for each environment are unique across your subscription (see the sketch below).
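
To make this concrete, here’s what a deployment of the compiled ARM template might look like using the Azure CLI. Treat this as a sketch only: the resource group name and parameter values are illustrative placeholders, not values from this post.

# deploy the compiled ARM template, supplying all three required parameters
# (resource group and parameter values are placeholders)
az deployment group create -g rg-myapp-dev -f services.json --parameters applicationName=myapp location=westus2 env=dev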

Setting Up Multiple Environments

Next we’ll define the different environments in Azure DevOps. We’ll keep things relatively simple and define three environments:

  1. Development – from the develop branch
  2. Testing – from the release branch
  3. Production – from the release branch but requiring approval

To create these environments, click on the Environments node under Pipelines in the navigation tree on the left side of the Azure DevOps portal for the project. Click the New environment button and enter a Name and Description for each environment.

Branch Control

For each environment we need to limit deployments so that only code from the appropriate branch can be deployed to the environment. To set up a branch control check for an environment you first need to open the environment by selecting it from the list of Environments. From the dropdown menu in the top right corner, select Approvals and checks. Next, click the Add check (+) button. Select Branch control and then click the Next button.

In the Branch control dialog we need to supply the name of the branch that we want to limit deployments from. For example for the Development environment we would restrict the Allowed branches to the develop branch (specified here as refs/heads/develop).

Note here that we’ve also included refs/tags/* in the list of Allowed branches. This is required so that we can use the bicep template from the Pipeline Templates repository. I’d love to know if there’s a way to restrict this check to only a specific repository, since adding refs/tags to the Allowed branches will mean that any tagged branch in my repository will also be approved. Let me know in the comments if you know of a workaround for this.

The Testing and Production environments are both going to be restricted to the Release branch. However, we’re also going to enforce a check on the branch to Verify branch protection.

What this means is that the Release branch will be checked to ensure branch policies are in place. For example the Release branch requires at least one reviewer and a linked work item.

Approvals

The only difference between the Testing and Production environments is that deployment to the Production environment requires a manual approval. This makes sense in most cases considering you may need to co-ordinate this with a marketing announcement or a notification to existing customers.

Setting up an approval gate starts similar to adding branch control. Open the environment and go to Approvals and checks. Click the Add check button and select Approvals. In the Approvals dialog, enter the list of users that can approve the deployment, adjust any of the other properties and then click Create.

We typically set a very short approval timeout period to avoid the scenario where multiple deployments get queued up behind each other. If we create another deployment before the first has been approved, we prefer to only have the latter deployment pushed to the Production environment. Of course, your team should discuss and agree on the strategy you want to employ.

Environment Variable Groups

Since we’re going to be deploying the same set of resources to each of the environments, we need a way to specify build and release variables – some that are common across all stages of the pipeline, and some that are specific to each environment. To make it easy to manage the variables used in the pipeline we’re going to use variable groups, which can be defined under the Library tab in Azure DevOps Pipelines.

Common Build Variables

We’ll create a variable group called Common Build Variables and add two variables, ResourceGroupLocation and AzureSubscriptionConnectionName.

As you can probably deduce, ResourceGroupLocation is the region where all the resources will be created. For simplicity this is assumed to be the same across all environments. We’ll come back to AzureSubscriptionConnectionName; for now it’s enough to know that the same connection is used for all stages in the pipeline.

Environment Specific Variables

For each environment we’re going to define a variable group named Common.[environment]. These variable groups will contain variables that are specific to each environment. In this case, we’re going to define the EnvironmentName, the EnvironmentCode and the ResourceGroupName.

We’ll see these variables in action shortly, but it’s important to remember that any variable that needs to vary based on the environment being deployed to should be defined in the appropriate variable group.
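
If you prefer to script the creation of these groups rather than clicking through the Library UI, the azure-devops CLI extension can do it. This is a hedged sketch: it assumes you’ve installed the extension and configured organisation/project defaults, and the variable values shown are placeholders.

# assumes: az extension add --name azure-devops
# and: az devops configure --defaults organization=https://dev.azure.com/<org> project=<project>
az pipelines variable-group create --name 'Common.Development' --authorize true --variables EnvironmentName=Development EnvironmentCode=dev ResourceGroupName=rg-myapp-dev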

Build and Release Process

The build and release process is going to be defined as a yaml pipeline in Azure DevOps.

Service Connections

Before we can jump in and write some yaml, we need to set up a couple of service connections.

  • Pipeline-Templates – this is a github service connection so that the build process can download the appropriate pipeline template to assist with the compilation of the bicep file.
  • Azure-Subscription – this is a link to the Azure subscription where the resource groups will be created and subsequently the resources created.

I’m not going to step through the process of creating these connections, since you can simply follow the prompts in the Azure DevOps portal. However, it’s important to take note of the names of the service connections.
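
If you’d rather script these as well, the azure-devops CLI extension has commands for both connection types. Treat this as a sketch: the IDs are placeholders, and the secrets are read from environment variables (AZURE_DEVOPS_EXT_GITHUB_PAT and AZURE_DEVOPS_EXT_AZURE_RM_SERVICE_PRINCIPAL_KEY respectively).

# GitHub connection for the pipeline templates repository
az devops service-endpoint github create --name Pipeline-Templates --github-url https://github.com/builttoroam/pipeline_templates

# Azure Resource Manager connection to the target subscription
az devops service-endpoint azurerm create --name Azure-Subscription --azure-rm-service-principal-id <appId> --azure-rm-subscription-id <subscriptionId> --azure-rm-subscription-name <subscriptionName> --azure-rm-tenant-id <tenantId>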

Build and Deploy Process

Here’s the full build and release pipeline, which we’ll step through in more detail below.

trigger:
  branches:
    include:
    - '*'  # must quote since "*" is a YAML reserved character; we want a string
  paths:
    include:
    - azure/services.bicep
    - pipelines/azure-services.yml
  
resources:
  repositories:
    - repository: pipelinetemplates
      type: github
      name: builttoroam/pipeline_templates
      ref: refs/tags/v0.7.0
      endpoint: Pipeline-Templates
  
pool:
  vmImage: 'windows-latest'
  
variables:
  - group: 'Common Build Variables'
  - name: application_name
    value: inspect
  - name: bicep_filepath
    value: 'azure/services.bicep'
  - name: arm_template_filepath
    value: '$(Pipeline.Workspace)/BicepArtifacts/services.json'
  
stages:
- stage: Compile_Bicep
  pool:
    vmImage: 'windows-latest'

  jobs:
  - job: Bicep
    steps:
      - template: azure/steps/bicep/bicep-build.yml@pipelinetemplates
        parameters:
          name: Bicep
          bicep_file_path: '$(System.DefaultWorkingDirectory)/$(bicep_filepath)'
          arm_path_variable: ArmFilePath

      - task: CopyFiles@2
        displayName: 'Copying bicep file to artifacts folder'
        inputs:
          contents: '$(Bicep.ArmFilePath)'
          targetFolder: '$(build.artifactStagingDirectory)'
          flattenFolders: true
          overWrite: true

      - task: PublishBuildArtifacts@1
        displayName: 'Publish artifacts'
        inputs:
          pathtoPublish: '$(build.artifactStagingDirectory)' 
          artifactName: 'BicepArtifacts' 
          publishLocation: Container


- template:  templates/deploy-arm.yml
  parameters:
    stage_name: 'Deploy_Development'
    depends_on: 'Compile_Bicep'
    deploy_environment: 'Development'

- template:  templates/deploy-arm.yml
  parameters:
    stage_name: 'Deploy_Testing'
    depends_on: 'Deploy_Development'
    deploy_environment: 'Testing'
  
- template:  templates/deploy-arm.yml
  parameters:
    stage_name: 'Deploy_Production'
    depends_on: 'Deploy_Testing'
    deploy_environment: 'Production'

Trigger – we’ve set up this process to kick off whenever a change to the bicep file or the pipeline definition is committed, on any branch.

Resources – this process leverages the templates from PipelineTemplates, which requires the definition of a resource pointing to the appropriate tagged release in the pipeline templates github repository.

Stages – there’s one build stage, followed by three release (aka deploy) stages. The steps for the build stage leverage the bicep template from pipeline templates, in order to generate the ARM template. The ARM template is then executed in each of the environments and deployed to the appropriate resource group.

The three deploy stages all use the same template, which we’ll see in a minute, coupled with environment-specific parameter values. For example the first stage passes ‘Development’ as the deploy_environment parameter.

Deploy Stage Template

As I mentioned, the three deployment stages are identical, except for some parameter values that are defined for each environment. The template referenced by each stage includes creating the resource group and then deploying the ARM template into it.

parameters:
- name: stage_name
  type: string
  default: 'Deploy_ARM_Resources'

- name: depends_on
  type: string
  default: ''

  # deploy_environment - Environment code
- name: deploy_environment
  type: string

stages:
- stage: ${{ parameters.stage_name }}
  dependsOn: ${{ parameters.depends_on }}
  variables:
  - group: 'Common.${{ parameters.deploy_environment }}'
  
  pool:
    vmImage: 'windows-latest'

  jobs:
  - deployment: 'Deploy${{ parameters.stage_name }}'
    displayName: 'Deploy ARM Resources to ${{ parameters.deploy_environment }}' 
    environment: ${{ parameters.deploy_environment }}
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            name: ${{ parameters.stage_name }}
            inputs:
              targetType: 'inline'
              workingDirectory: $(Pipeline.Workspace)
              script: |
                  $envParam = '${{ parameters.deploy_environment }}'
                  Write-Host "Deployment deploy environment parameter: $envParam"

                  $envName = '$(EnvironmentName)'
                  Write-Host "Deployment environment name variable: $envName"

          - task: AzureCLI@2
            displayName: 'Create resource group - $(ResourceGroupName)'
            inputs:
              azureSubscription: $(AzureSubscriptionConnectionName)
              scriptType: ps
              scriptLocation: inlineScript
              inlineScript: |
                Write-Host "Creating RG: $(ResourceGroupName)"
                az group create -n $(ResourceGroupName) -l $(ResourceGroupLocation)
                Write-Host "Created RG: $(ResourceGroupName)"

          - task: AzureResourceGroupDeployment@2
            displayName: 'Deploying ARM template to $(ResourceGroupName)'
            inputs:
              azureSubscription: $(AzureSubscriptionConnectionName)
              action: 'Create Or Update Resource Group' 
              resourceGroupName: $(ResourceGroupName)
              location: $(ResourceGroupLocation) 
              templateLocation: 'Linked artifact'
              csmFile: '$(arm_template_filepath)' # Required when  TemplateLocation == Linked Artifact        
              overrideParameters: '-location $(ResourceGroupLocation) -env $(EnvironmentCode) -applicationName $(application_name)'

Throughout this pipeline there are various variables referenced. The important thing to note is that each variable needs to either be defined for every environment or be environment independent.

The Common Build Variables group was imported in the build and release process yaml file. The ResourceGroupLocation is the only variable from this group that’s used within this template.

The variable group for each environment is imported within the deploy stage template. The environment name, which is passed in as the deploy_environment parameter, is concatenated with “Common.” in order to import the correct variable group. The imported variables can be referenced the same way as other locally defined variables.

The deployment pipeline template has three steps: the first simply outputs variables so that it’s clear which environment is being deployed to; then there’s a task for creating the resource group; and lastly a task for deploying the ARM template.

Running the Pipeline End to End

In this post we’ve defined three different environments and configured Azure DevOps with a different variable group for each environment. Here you can see an execution of the pipeline with the build stage, followed by three deployment stages.

In this case note the Testing deployment failed because the resources were being deployed from the wrong branch (develop instead of release). Unfortunately because this one stage failed, the entire pipeline was marked as failed.

Building a Bicep from an ARM Template with Parameters


As I start to work with Azure Bicep I figured I’d share some of my experiences here. First up, is just a quick recap of how to generate bicep code from an ARM template.

Actually, the problem I initially started with was how to start writing bicep code, since I wasn’t really familiar with either the bicep syntax or how to define each of the resources I wanted to create. I had skimmed through the docs that the team have put together, which helped me understand the basic syntax. Then I took a look through the various examples that the team had posted. In fact, they even have an interactive playground where you can enter bicep code and have it generate the corresponding ARM template. In the top right corner, they have a list of sample templates, along with the corresponding bicep code.

Now that I’d brushed up on the syntax and had trawled through a dozen or so of the examples, I figured I was ready to write my first bicep file. Hold up…. before I get into writing some bicep code, I actually decided to make sure I could build and deploy my bicep code locally (if you want to build and run your bicep code in an Azure DevOps pipeline, check out this post).

Build and Deploy Bicep Code

Here’s a quick summary of working with Visual Studio Code to write, compile and deploy your Bicep code.

  • Install the latest Bicep version (v0.1.37-alpha at time of writing) – I’d recommend using the installer as this will correctly set up path variables etc
  • Install the Visual Studio Code extension – this isn’t available via the extensions marketplace, so you’ll need to download it from the release page on github. Make sure you follow the instructions to install the extension by selecting the downloaded VSIX from within Visual Studio Code (rather than double-clicking the downloaded file)
  • Launch Visual Studio Code and create your first, empty, bicep file eg myfirstbicep.bicep. Note that Visual Studio Code will recognise the file type and show “Bicep” as the language in the bottom right corner of the window.
  • Make sure you have the Azure CLI Tools extension installed for Visual Studio Code
  • Create an azcli file, eg myfirstbicep.azcli, that will be used to run Azure CLI commands. Alternatively you can just enter the commands into a PowerShell terminal (assuming you have the Azure CLI installed).
  • Add the following code to the azcli file – comments should explain what each line does.
# generates myfirstbicep.json (ie compiles the bicep file to ARM template)
bicep build myfirstbicep.bicep 

# create a resource group to deploy the bicep code to
# this is ignored if resource group already exists
az group create -n rgbicepexample -l westus 

# deploy the ARM template that was generated from bicep file
az deployment group create -f myfirstbicep.json -g rgbicepexample

# delete the resource group
# this will prompt to confirm deletion
az group delete -n rgbicepexample
  • Execute each line in the azcli file by right-clicking and selecting Run Line in Terminal

If you run each of the lines you should see output similar to the following

In this scenario I ran the final command to clean up the resource group. If you’re in the process of writing your bicep code, chances are that you’ll simply create the resource group once and then keep compiling the bicep code to the ARM template (first command in the azcli file) and deploying the updated ARM template.

First Bicep Resource

Now that we have the tooling setup so that we can compile and deploy our bicep file, it’s time to dig in and start to create resources. Despite having some examples to follow, I was still a bit bewildered by not knowing what options I needed to include for any given resource. I decided to take a rather pragmatic approach by using the Azure portal to generate resources, export the ARM template and then convert that to bicep code. The last step, as you’ll see, is very manual and repetitive – hopefully this will get easier when this GitHub issue is addressed, providing us with a tool to at least automatically generate bicep code from an ARM template.

Let’s step through this process and I’ll point out a couple of things along the way. I’ll create a Storage Account that can be deployed to a resource group. This should be enough to give you a flavour for the approach.

  • Start by invoking the “az group create” command in Visual Studio Code to make sure you have a resource group to work with. Alternatively you can work with any existing resource group if you’d prefer (I tend to avoid doing this to ensure I don’t accidentally overwrite, or delete, other resources I might have)
  • Head to the Azure Portal and open the resource group. Click the Add button to launch the New resource wizard.
  • Search for Storage Account and provide the necessary details to create a Storage Account.
  • I’m not going to go through the various settings here but once you get to the Review + Create tab, you’ll see that there is a link at the bottom to Download a template for automation.
  • When you click through to the template, you’ll see that it’s nicely presented with a tree view to allow for easy navigation around the ARM template
  • Rather than simply copying the entire ARM template into our bicep file (and then having to change the syntax from ARM to Bicep), we’re going to do this in steps. We’re going to start with the parameters but instead of using the Parameters in the ARM template, we’re going to grab the json from the Parameters tab.
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "location": {
            "value": "westus2"
        },
        "storageAccountName": {
            "value": "stbicepexample"
        },
        "accountType": {
            "value": "Standard_RAGRS"
        },
        "kind": {
            "value": "StorageV2"
        },
        "accessTier": {
            "value": "Hot"
        },
        "minimumTlsVersion": {
            "value": "TLS1_2"
        },
        "supportsHttpsTrafficOnly": {
            "value": true
        },
        "allowBlobPublicAccess": {
            "value": true
        },
        "networkAclsBypass": {
            "value": "AzureServices"
        },
        "networkAclsDefaultAction": {
            "value": "Allow"
        }
    }
}
  • Copy the json for the parameters into the bicep file and then start trimming the bits we don’t need – remember the bicep syntax is more succinct, so in a lot of cases the change to syntax is simply removing braces and the verbose ARM template code.

From the parameters json, the main things we need are the parameters and their default values; everything else can go. You’ll need to insert the keyword “param” and the type for each parameter – most are obvious from the default value but if in any doubt, you can go back to the ARM template and look in the parameters list. You’ll also need to change all strings from using double quotes (") to single quotes ('). What we’re left with is a very compact list of parameters, with their default values.

param location string = 'westus2'
param storageAccountName string = 'stbicepexample'
param accountType string = 'Standard_RAGRS'
param kind string = 'StorageV2'
param accessTier string = 'Hot'
param minimumTlsVersion string = 'TLS1_2'
param supportsHttpsTrafficOnly bool = true
param allowBlobPublicAccess bool = true
param networkAclsBypass string = 'AzureServices'
param networkAclsDefaultAction string = 'Allow'

By providing default values for each of the parameters, we’re making them optional – if a parameter value is provided as part of running the deployment, it will be used, otherwise the default value will be used. If you want to require a parameter to be provided as part of the deployment, simply remove the default value (eg change param location string = 'westus2' to just param location string).
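
For example, with the defaults in place you can deploy with no parameters at all, or override just the ones you care about. A sketch using the same file and resource group as above; the overridden storage account name is an arbitrary example value:

# all defaults
az deployment group create -f myfirstbicep.json -g rgbicepexample

# override a single parameter, fall back to defaults for the rest
az deployment group create -f myfirstbicep.json -g rgbicepexample --parameters storageAccountName=stanotherexample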

  • Next up is to copy across the json from the ARM template for the resource itself. I simply grab the resources block of json:
    "resources": [
        {
            "name": "[parameters('storageAccountName')]",
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2019-06-01",
            "location": "[parameters('location')]",
            "properties": {
                "accessTier": "[parameters('accessTier')]",
                "minimumTlsVersion": "[parameters('minimumTlsVersion')]",
                "supportsHttpsTrafficOnly": "[parameters('supportsHttpsTrafficOnly')]",
                "allowBlobPublicAccess": "[parameters('allowBlobPublicAccess')]",
                "networkAcls": {
                    "bypass": "[parameters('networkAclsBypass')]",
                    "defaultAction": "[parameters('networkAclsDefaultAction')]",
                    "ipRules": []
                }
            },
            "dependsOn": [],
            "sku": {
                "name": "[parameters('accountType')]"
            },
            "kind": "[parameters('kind')]",
            "tags": {}
        }
    ],
  • Converting this code is a little trickier but essentially you need to start by declaring the resource, which will look like the following, where [type] and [apiVersion] need to be replaced by the values from the ARM template.
resource myStorage '[type]@[apiVersion]' = {
    properties: {
    }
}

Essentially the resource declaration is just a key-value pair object graph. To convert the ARM template to the corresponding object graph you mainly just need to remove the quotation marks, parentheses and trailing commas. You’ll also need to replace the parameter references with a simple reference to one of the param variables we declared earlier (eg parameters('kind') becomes just kind). The converted resource (including the param lines we converted earlier) then looks like this:

param location string = 'westus2'
param storageAccountName string = 'stbicepexample'
param accountType string = 'Standard_RAGRS'
param kind string = 'StorageV2'
param accessTier string = 'Hot'
param minimumTlsVersion string = 'TLS1_2'
param supportsHttpsTrafficOnly bool = true
param allowBlobPublicAccess bool = true
param networkAclsBypass string = 'AzureServices'
param networkAclsDefaultAction string = 'Allow'


resource myStorage 'Microsoft.Storage/storageAccounts@2019-06-01' = {
    name: storageAccountName
    location: location
    sku: {
        name: accountType
    }
    kind: kind
    properties: {
        accessTier: accessTier
        minimumTlsVersion: minimumTlsVersion
        supportsHttpsTrafficOnly: supportsHttpsTrafficOnly
        allowBlobPublicAccess: allowBlobPublicAccess
        networkAcls: {
            bypass: networkAclsBypass
            defaultAction: networkAclsDefaultAction
        }
    }
}

This bicep code is good to go – you can compile and deploy this to your resource group. Don’t forget, once your code is finished and you’ve tested it locally, make sure you commit it to Azure DevOps and deploy it to Azure as part of a CI/CD process.

Important Note: A lot of Azure resources are pay for use, so you won’t rack up the dollars just by creating resources. However, there are some resources that will start to cost you as soon as they’re created. I would highly recommend deleting your development resource group whenever you’re done for the day, that way you can be sure you’re not going to continue to be charged.

Pipeline Templates: Using Project Bicep with Azure DevOps Pipeline to Build and Deploy an ARM Template

This post covers how you can use a .bicep file in your Azure DevOps pipeline to deploy resources to Azure via an ARM template

It’s great to see that Microsoft listens to the pain that developers and devops professionals go through in working with Azure. Specifically, that ARM templates, whilst very powerful, are insanely verbose and cumbersome to work with. Hence Project Bicep, which is at its core a DSL for generating ARM templates. In this post I’m not going to go into much detail on how to work with .bicep files but rather on how you can use a .bicep file in your Azure DevOps pipeline to deploy resources to Azure.

If you are interested in learning more about Project Bicep and the .bicep file format, there are a number of posts around that provide good introductory material.

In order to deploy a .bicep file to Azure, you first need to use the bicep command line to generate the corresponding ARM template. You can then deploy the ARM template to Azure. Rather than having to add multiple steps to your build pipeline, wouldn’t it be nice to have a single step that simply takes a .bicep file and some parameters, and deploys it to Azure? Enter the bicep-run template, which is part of the v0.7.0 release of Pipeline Templates.

If you haven’t worked with any of the templates from the Pipeline Templates project, here’s the quick getting started:

Add the following to the top of your pipeline – this defines an external repository called all_templates that can be referenced in your pipeline.

resources:
  repositories:
    - repository: all_templates
      type: github
      name: builttoroam/pipeline_templates
      ref: refs/tags/v0.7.0
      endpoint: github_connection

Next, we’re going to use the bicep-run template to deploy our .bicep file to Azure.

      - template: azure/steps/bicep/bicep-run.yml@all_templates
        parameters:
          name: BicepRun
          az_service_connection: $(service_connection)
          az_resource_group_name: $(resource_group_name)
          az_resource_location: $(resource_location)
          bicep_file_path: '$(bicep_filepath)'
          arm_parameters: '-parameter1name $(parameter1value) -parameter2name $(parameter2value)'

The bicep-run template wraps the following:

  • Downloads and caches the Project Bicep command line. It currently references the v0.1.37 release but you can override this by specifying the bicep_download_url – make sure you provide the url to the windows executable, not the setup file.
  • Runs the Bicep command line to convert the specified .bicep file (bicep_file_path parameter) into the corresponding ARM template
  • Uses the Azure Resource Group Deployment Task to deploy the ARM template into Azure. The arm_parameters are forwarded to the overrideParameters parameter on the Azure Resource Group Deployment task.
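
Conceptually (this isn’t the template’s literal implementation, which uses the Azure Resource Group Deployment task), those steps boil down to something like the following shell commands, with the resource group and parameter values coming from the template parameters:

# compile the .bicep file to an ARM template
bicep build azure/services.bicep

# deploy the generated ARM template to the target resource group
az deployment group create -g <az_resource_group_name> -f azure/services.json --parameters <your parameter overrides>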

I’d love feedback from anyone who takes this template for a spin – what features would you like to see added? What limitations do you currently see for Project Bicep and the ability to run it using this task?

Note: The bicep-run template is designed to run on a windows image.

Add Build 2020 Schedule to Your Calendar


Last week Microsoft released the schedule for sessions for Build 2020. However, other than via the portal, there’s no easy way to get a list of sessions so that you can easily work out which sessions work for your time-zone and which sessions you want to attend. In order to solve my own dilemma of how to add the sessions to my calendar I built a quick service that would generate the appropriate ICS files. Actually my first option was a single ICS with all sessions, hence there are two options:

All Build 2020 Sessions in a Single ICS file

All Build 2020 Sessions as Individual ICS files

Here’s how the whole thing went down – mid-last week I was wondering whether there was an easy way to add the schedule to my Outlook calendar. At this point the only schedule that had been announced was a high level list of just some of the sessions.

Given there’s not that many sessions, it would have been quick enough for me to add them individually to Outlook. But no, the developer in me went hunting for how to automate this. Opening Fiddler I saw that there was indeed an API that was being called at endpoints beginning with https://api.mybuild.microsoft.com. However, at this point there weren’t any calls to retrieve sessions, speakers etc, the only calls were to the /chrome and /settings endpoints which didn’t really reveal much.

Thinking that I wouldn’t be able to get much further I tried a last-ditch effort of searching for other uses on the web for api.mybuild.microsoft.com. I guessed there wouldn’t be any direct hits, other than for the mybuild.microsoft.com portal, but I figured that given the way that search engines look for similar matches, something might turn up. Sure enough, the api url is not too dissimilar to the one used for prior events, eg https://api.mybuild.techcommunity.microsoft.com. This led me to a pretty sophisticated PowerShell script that Michel de Rooij had put together for scraping session content for prior events.

Knowing that Microsoft typically uses the same vendor for things to make it easier year-on-year, it’s not surprising that the api described by Michel’s PowerShell script was again being used for Build 2020.

I quickly used QuickType (not Json2Csharp – I can never remember this name but the service is great!) to generate the classes required to interact with the API. I can’t say that I engineered the ASP.NET Core 3.1 Web API service to be particularly efficient, mainly because I predicted that I was going to put both Azure CDN and CloudFlare in front of it. It basically requests the entire list of sessions and holds it in a static variable. The only caching is that if the service is requested again whilst the static variable hasn’t been collected, it will return the previously retrieved list of sessions (do NOT follow this architecture for a production application).

I used iCal.NET to generate the ICS files – initially one monster ICS which has all the sessions but then I realised that it might be better to have individual ICS files that were downloaded as a Zip file. The ICS file(s) also contain session speakers (with links to their profile and/or their company, where available) and a link to the session for registering and attending.

Next I published the ASP.NET Core service to a new Azure App Service. As this was a personal project I opted for a free Azure App Service Plan. Of course, this will not scale well, since it’s really designed for development and testing purposes. However, given that I’m generating ICS files that aren’t likely to change frequently, I figure I’m going to protect my service so that it doesn’t get slammed.

The protection comes in two forms:

  • Firstly, I added an Azure CDN endpoint in front of the app service – this will cache the output of my service (level 1)
  • Secondly, I added CloudFlare for two reasons. This gives me a nice friendly url (ie build2020.builttoroam.com) that I can share, without exposing either my Azure CDN endpoint or my app service url. I also enabled the proxying option, which means that CloudFlare also caches the output of my service (level 2).

The upshot is that even if a massive number of people download the ICS file(s), my service shouldn’t even need to wake up. This is reflected by the analytics from my Azure App Service, which show almost zero outbound data over the last 24 hours.

Hopefully the ICS files (Single ICS file or Individual ICS files) helps you build your schedule for Build 2020. Look forward to seeing everyone there, virtually of course!

Pipeline Template: Applying a Launch Icon Badge to Identify Environments and Versions of your App


A question was raised this week by Sturla as to how to incorporate the Launch Icon Badge extension into the build process when making use of the templates from Pipeline Templates (Damien covers how to use this extension in an Azure DevOps pipeline in his post on the topic). By the way, a big thank you to Sturla for testing each release of the templates and providing invaluable feedback along the way.

The purpose of the Launch Icon Badge extension is to make it really easy to add a banner to your application icon (or any other image) to indicate that your application is prerelease. As you’ll see from the yaml in the example below there are a lot of attributes you can control, meaning that you can use this to signify whatever you want. For example you may want to indicate which environment the app is targeting by changing the background colour of the banner.

To get started with the Launch Icon Badge extension you need to make sure you install the extension to your Azure DevOps Organisation. This is important, otherwise any attempt to use the LaunchIconBadge task will fail.

If you’re using one of the build templates from Pipeline Templates you simply need to extend it by adding the LaunchIconBadge task to the preBuild task list. This is shown at the end of the following YAML snippet.

stages:
- template:  azure/mobile/build-xamarin-android.yml@all_templates
  parameters:
    # Stage name and whether it's enabled
    stage_name: 'Build_Android'
    build_android_enabled: ${{variables.android_enabled}}
    # Version information
    full_version_number: '$(version_prefix).$(Build.BuildId)'
    # Signing information
    secure_file_keystore_filename: '$(android_keystore_filename)'
    keystore_alias: '$(android_keystore_alias)'
    keystore_password: '$(android_keystore_password)'
    # Solution to build
    solution_filename: $(solution_file)
    solution_build_configuration: $(solution_build_config)
    # Output information
    artifact_folder: $(artifact_android_folder)
    application_package: $(android_application_package)
    preBuild:
      - task: LaunchIconBadge@1
        inputs:
          sourceFolder: '$(Build.SourcesDirectory)/src/Apps/DotNet/Uno/InspectorUno/InspectorUno/InspectorUno.Droid' # Optional. Default is: $(Build.SourcesDirectory)
          contents: '**/*.png' # Optional. Default is:  '**/*.png'
          bannerVersionNamePosition: 'bottomRight' # Options: topLeft, topRight, bottomRight, bottomLeft. Default is: 'bottomRight'
          bannerVersionNameText: 'Prerelease'  # Optional. Default is: ''
          bannerVersionNameColor: '#C5000D' # Optional. Default is: '#C5000D'
          bannerVersionNameTextColor: '#FFFFFF' # Optional. Default is: '#FFFFFF'
          bannerVersionNumberPosition: 'top' # Optional. top, bottom, none. Default is: 'none'
          bannerVersionNumberText: '$(version_prefix).$(Build.BuildId)' # Optional. Default is: ''
          bannerVersionNumberColor: '#34424F' # Optional. Default is: '#34424F'
          bannerVersionNumberTextColor: '#FFFFFF' # Optional. Default is: '#FFFFFF' 

Important Note: In v0.5.2 of the Pipeline Templates you can no longer pass in a variable to the build_android_enabled parameter using the $(variablename) syntax. As shown here you need to use the ${{variables.variablename}} syntax to ensure it’s resolved as part of inflating the templates.

When you’re attempting to get the LaunchIconBadge task to work for you, make sure that your sourceFolder and contents are configured to locate the app icons for your application. This took me a couple of iterations as I got the path wrong the first couple of times and it resulted in no images being found – check the build log for full information on what the task is doing.

Once you have everything configured, your application should be built with the appropriate banner across the application icon. I configured this for the Android, iOS and Windows builds for my application and this is what I now see in App Center.

Pipeline Templates v0.5.2

Here’s a summary of what’s in release v0.5.2:

Breaking Changes:

  • build-xamarin-[iOS/android/windows].yml and deploy-appcenter.yml – XXX_enabled parameter (eg windows_enabled or deploy_enabled) no longer supports passing a variable in using $(variablename) syntax. Make sure you use ${{variables.variablename}} syntax. This also means that only variables defined in the YAML file are supported – NOT variables defined in variable groups or in the UI for the pipeline. Going forward it’s recommended to use runtime parameters for getting user input when invoking a pipeline.

Other Changes:

  • none

Pipeline Templates: How to use a file for release notes?


When deploying a release to AppCenter you can specify release notes that get presented to the user when they go to download a new release. This week a question was asked as to how to specify release notes from a file when submitting a new app version to AppCenter.

If you looked at the complete example I provided for building and deploying a Uno app to Android, iOS and Windows, you might be wondering how you can provide release notes at all since we didn’t specify any release notes when we used the AppCenter deploy template. However, if you take a look at the parameters list you can see that there is an appcenter_release_notes parameter which has a default value set. Use this parameter if you want to specify the release notes inline when invoking the template.

In the v0.5.1 release we added two new parameters that allow you to specify a file for release notes to be taken from: appcenter_release_notes_option and appcenter_release_notes_file. To specify a file for release notes, you first need to set the appcenter_release_notes_option parameter to file. Then you need to use the appcenter_release_notes_file parameter to specify the file that you want to use.

In theory this sounds easy: you add a release notes file to your repository and then simply provide a relative path to the file in the appcenter_release_notes_file parameter. You will find this does not work! The AppCenter deploy template doesn’t know anything about your source code repository, since it’s designed to deploy the artifacts from a prior build stage.

Ok, so then the question is, how do we make the release notes file available to the AppCenter template? Well we pretty much answered that in the previous paragraph – you need to deploy it as an artifact from the build process.

Assuming you’re using one of the build templates from https://pipelinetemplates.com, you can easily do this by adding additional steps as part of the prePublish extension point. Here’s an example:

- template:  azure/mobile/build-xamarin-ios.yml@all_templates
  parameters:
    # Stage name and whether it's enabled
    stage_name: 'Build_iOS' 
    build_ios_enabled: $(ios_enabled)
    # Version information
    full_version_number: '$(version_prefix).$(Build.BuildId)'
    # Solution to build
    solution_filename: $(solution_file)
    solution_build_configuration: $(solution_build_config)
    # Signing information
    ios_plist_filename: 'src/Apps/DotNet/Uno/InspectorUno/InspectorUno/InspectorUno.iOS/Info.plist'
    ios_cert_password: '$(ios_signing_certificate_password)'
    ios_cert_securefiles_filename: '$(ios_signing_certificate_securefiles_filename)'
    ios_provisioning_profile_securefiles_filename: '$(ios_provisioning_profile_securefiles_filename)'
    # Output information
    artifact_folder: $(artifact_ios_folder)
    application_package: $(ios_application_package)
    prePublish:
      - task: CopyFiles@2
        displayName: 'Copying release notes'
        inputs:
          contents: 'src/Apps/DotNet/releasenotes.txt'
          targetFolder: '$(build.artifactStagingDirectory)/$(artifact_ios_folder)'
          flattenFolders: true
          overWrite: true

- template:  azure/mobile/deploy-appcenter.yml@all_templates
  parameters:
    # Stage name and dependencies
    stage_name: 'Deploy_iOS'
    depends_on: 'Build_iOS'
    deploy_appcenter_enabled: $(ios_enabled)
    environment_name: $(appcenter_environment)
    # Build artifacts
    artifact_folder: $(artifact_ios_folder)
    application_package: $(ios_application_package)
    # Deployment to AppCenter
    appcenter_service_connection: $(appcenter_service_connection)
    appcenter_organisation: $(appcenter_organisation)
    appcenter_applicationid: $(appcenter_ios_appid)
    appcenter_distribution_group_ids: '5174f212-ea8c-4df1-b159-391200d7af5f'
    appcenter_release_notes_option: file
    appcenter_release_notes_file: 'releasenotes.txt'

Note that the CopyFiles task copies the release notes into the artifact_folder sub-folder of the staging directory. This is important because the AppCenter deploy template will look in the sub-folder for the release notes file specified using the appcenter_release_notes_file parameter.

Pipeline Templates: Stage Dependency Fix and Improved Docs


We just released v0.5.1 of the Pipeline Templates – templates for creating build pipelines for Azure DevOps. This was a minor increment but fixed an issue that emerged due to a change in the validation of pipelines by Azure DevOps.

Before we get into the details of what is included in v0.5.1, the other big news is that Pipeline Templates has a new documentation site.

http://pipelinetemplates.com

This site uses GitHub Pages and as such is generated from the markdown files in the project itself. Currently the site has a getting started guide, followed by a summary of each of the templates. Hopefully over time this can grow as we include more examples and extend the list of templates.

Pipeline Templates v0.5.1

Here’s a summary of what’s in this release:

Breaking Changes:

  • None

Other Changes:

  • Fix: build-xamarin-[iOS/android/windows].yml and deploy-appcenter.yml – changed depends_on parameter from stageList to string to fix validation issue introduced by Azure DevOps
  • Added NuGetInstallAndRestore.yml based on a template suggested by Damien Aicheh in his post on how to Add nightly builds to your Xamarin applications using Azure DevOps
  • Updated deploy-appcenter.yml to include additional parameters for controlling how apps are deployed via AppCenter. For example you can now specify release notes as a file, and you can flag whether a release is a mandatory update or not.
  • Added Markdown files that document each of the templates. Go to https://pipelinetemplates.com to see the rendered output.

Publishing ASP.NET Core 3 Web API to Azure App Service with Http/2


In my previous post I was testing a new ASP.NET Core 3 Web API that I’d created that simply returns header and http information about the request. Having got everything working locally I decided that I should push it into an Azure App Service to make it accessible from anywhere (this seemed to be easier than attempting to connect to my locally running service from a Xamarin.Forms application). Here’s the process:

Right-click on the ASP.NET Core project and select Publish.


In this case we’re going to select App Service (ie a Windows host) and Create New, followed by the Publish button. Next we need to give the new App Service a name and specify both a Resource Group and an App Service Plan – in this case I’m going to create all of these as part of the publishing process


Hitting Create will firstly create the necessary Azure resources and then it will proceed with publishing the ASP.NET Core project into the App Service. Unfortunately, once this process has finished you’ll see that the launched url doesn’t load correctly.


And secondly, when you return to Visual Studio you’ll see a warning prompt indicating that ASP.NET Core 3 isn’t supported in Azure App Service right now.


Luckily the Microsoft documentation has you covered. If you go to the main documentation on publishing to Azure App Service there is a link off to deploying preview versions of ASP.NET Core applications. This document covers two different ways to fix this issue – you can either install the preview site extensions for ASP.NET Core 3, or you can simply change your deployment to be a self-contained application. In this case we’re going to go with deploying a self-contained application, since this reduces any external dependencies, which seems sensible to me.

After returning to Visual Studio and dismissing the above version warning, you’ll see the Publish properties page with the default publish configuration (you can get back to this page by right-clicking your ASP.NET Core project and selecting Publish in the future).


We’re going to click the pencil icon alongside any of the summary properties to launch the Publish dialog and change the Deployment Mode to Self-Contained, and the Target Runtime to win-x86. You may be tempted to select win-x64 but only do this if the Platform setting on your App Service is set to 64 Bit, otherwise your service won’t start and you’ll see a 503 service error.
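
If you prefer the command line over the Publish dialog, a roughly equivalent self-contained publish can be done with the dotnet CLI (a sketch, using the same configuration and runtime chosen above):

# self-contained publish targeting 32-bit Windows
dotnet publish -c Release -r win-x86 --self-contained true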


Click Save and then the Publish button to publish the application using the updated publishing properties. Note that if you’re on a network that has a slow uplink (eg ADSL) this might take a while, so you might consider jumping on a fast network (eg 4G mobile) to do the upload (and yes, this does make Australia sound like an under-developed nation when it comes to access to the internet – sigh!).

Without any further messing around, the ASP.NET Core application launches correctly:


Hmmm, but wait, shouldn’t it be reporting HTTP/2? After all, that’s what the browser was reporting when I ran the same service on Kestrel. There are a couple of pieces to this answer, but before we get to them I want to remove any element of confusion as to what’s going on here by switching across to using curl – this way we have both control over what protocol we’re requesting and detailed logging of what protocol is being used (you’ll see why this is important in a minute). Executing the following:

curl https://headerhelper.azurewebsites.net/api/values -v


As you can see from the curl output, the response was indeed served over Http/1.1, which is consistent with the Http protocol listed by the service. Ok, so let’s try requesting Http/2:

curl https://headerhelper.azurewebsites.net/api/values --http2 -v


This call is successful but again returns Http/1.1 – this is because curl is attempting to request an upgrade to http/2 but the service isn’t willing/able to upgrade the connection.

curl https://headerhelper.azurewebsites.net/api/values --http2-prior-knowledge -v


This call fails because curl is forcing the use of Http/2 when in fact the service isn’t able to communicate using Http/2. So, how do we fix this? Well the good news is that Azure App Service has a simple configuration setting that can be used to enable Http/2. Here I’m just setting the HTTP version in the Configuration page for the Azure App Service.


This can also be set via the resource explorer, as covered by a number of other people (eg this post). After making your change, don’t forget to Save changes and then give the service 30-60 seconds for it to be restarted – if you attempt to request the service immediately you’ll still get Http/1.1 responses.
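
Enabling HTTP/2 can also be scripted with the Azure CLI, which is handy if you’re configuring the App Service as part of a pipeline. A sketch: the app name matches the service used in this post, but the resource group name is a placeholder.

# enable HTTP/2 on the App Service
az webapp config set --name headerhelper --resource-group <resource-group> --http20-enabled true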

After the change has been applied, here’s what we see when we use the same curl commands as above:

curl https://headerhelper.azurewebsites.net/api/values --http2 -v

curl https://headerhelper.azurewebsites.net/api/values --http2-prior-knowledge -v


Note that it doesn’t matter whether we attempt to negotiate the http/2 upgrade (--http2 flag) or force the point (--http2-prior-knowledge); in both cases the connection reports HTTP/2. However, what’s not cool is that the Http protocol returned by the service is HTTP/1.1 – this is what is seen by the ASP.NET Core Web API.

What we’re seeing here is that Azure is terminating the Http/2 connection and then communicating to the underlying ASP.NET Core application using Http/1.1. This is consistent with the way that SSL support is done – Azure terminates the SSL connection, meaning that your ASP.NET Core application doesn’t need to worry about fronting a secure service. This is awesome for developers that want to add SSL or HTTP/2 to their existing services – you just enable the option in the configuration page of your App Service. However, the downside is that it makes leveraging some of the underlying capabilities of HTTP/2 impossible – for example, it’s currently impossible to host a GRPC service in an App Service as this relies on HTTP/2 to function.

The question still remains – when I request the service from the browser, what protocol is being used? The response returns HTTP/1.1 because that’s what our ASP.NET Core application sees. However, if we look at the browser debugging tools, we can see that the response is indeed being handled over an HTTP/2 connection. Note that this isn’t exposed in the UI of the debugging tools but if you save the request you can see the full details.


Opening the HAR file in VS Code shows the full details of the request, including the HTTP/2 protocol.


And there you have it – deploying an ASP.NET Core 3 application to Azure App Service and exposing it using HTTP/2.

Deploying ASP.NET Core 3 to Linux Azure App Service with Docker


In my earlier post I covered creating and debugging an ASP.NET Core service using Docker Desktop. I’m going to build on that and look at how you then push the service into an Azure App Service. Normally I’d simply use the publish option that would allow me to push the service directly into an Azure App Service – this would run the service in much the same way as it runs locally if I was debugging on IISExpress. However, since I’m debugging via a docker container I figured it’d be great to push to Azure in a way that it continues to run in a container. Actually, the reason I explored this in the first place was that I’ve been experimenting with GRPC and currently this doesn’t seem to be supported on a regular Azure App Service. I figured if I could run my code in a container there would be fewer restrictions, so I would be able to get GRPC to work (this is still a work in progress).

What I did notice in Visual Studio is that when I right-clicked on my ASP.NET Core project and selected Publish, one of the options I saw was to Create new App Service for Containers.


Clicking Publish started the wizard to allow me to create the appropriate resources in Azure.


Clicking Create will trigger the creation of the selected Azure resources and then proceed to publish the application.

Note: My initial attempt to publish failed with an error:

“The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.”

Turns out I didn’t have Docker Desktop running (I have so many apps like Slack, Teams, Skype etc that run in background that I force quit most of them each day to try to retain some semblance of a reasonable battery life on my laptop).

Once I realised that I needed to start Docker Desktop, the publish process kicked off and I saw the Docker deployment console appear with quite a detailed breakdown of the upload status – seriously, why can’t Visual Studio’s build progress window be this useful? I really need to hand it to the Docker team as their uploading was super resilient. I started uploading over our standard wifi connection, which is backed by ADSL, so minimal upload bandwidth. I got impatient, so I switched mid-upload across to my mobile hotspot – after a second or two delay, the upload retried and continued without missing a beat.

image

Once publishing has completed, the Azure App Service should be all set up and ready to go with your code already published. You can use the Site URL in the publish information pane to launch the service for testing. Since my application was a web api, I’ve appended /api/values so that I get a valid response from the Get request.

image

One of the things that continues to amaze me about Visual Studio is the ability to create and publish new projects to Azure. Of course, for production apps you wouldn’t follow this process at all, but it does make spinning up end-to-end prototype applications a walk in the park.

Deploying Uno Wasm using Blob Storage

Deploying Uno Wasm using Blob Storage

Earlier today I posted about deploying Uno on Wasm to an Azure App Service (to which the Uno team replied on Twitter with an updated web.config). I was thinking a bit more about how I would deploy a real Uno Wasm app and I realised that of course, there’s no server side logic, so I could just go host it off Blob Storage.

I ran up the Azure Storage Explorer and created a new container within an existing Blob Storage account

image

As I’m going to be serving up content directly from Blob Storage I need to make sure that I enable public read access on the individual blobs (the default is that there’s no public read access on the container or blobs). Right-click on the container and select Set Public Access Level

image

Set access to Public read access for blobs only

image
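
As an alternative to Storage Explorer, the same access level can be set with the Azure CLI – a minimal sketch, assuming a storage account named mystorageaccount and a container named unoapp:

az storage container set-permission --account-name mystorageaccount --name unoapp --public-access blob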

Now you can simply copy the Uno Wasm application into the container by dragging the files from File Explorer into the right pane of the Azure Storage Explorer. I found that the best folder to copy is the folder that Visual Studio uses to deploy files when you do a Publish. For my project this is found at FirstUnoProject.Wasm\obj\Release\netstandard2.0\PubTmp\Out\FirstUnoProject.Wasm\dist\bin\Release\netstandard2.0\dist.

And that’s it – now I can run my Uno Wasm application by clicking the Copy URL from the tool bar and then switching to my browser and launching the corresponding URL.
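
For reference, the resulting URL follows the standard Blob Storage format – something like this, where the account and container names are placeholders:

https://mystorageaccount.blob.core.windows.net/unoapp/index.html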

image

The Uno Wasm application runs without any further configuration required.

image

Publishing Uno WebAssembly (Wasm) to Azure App Service

Publishing Uno WebAssembly (Wasm) to Azure App Service

I figured that since I had Uno working in WebAssembly locally on my machine, I’d try publishing it out to an Azure App Service. I mean, how hard could it be since Visual Studio recognises that the Wasm project is a web project (nice job there, team Uno!) and even gives me the option to Publish.

image

Stepping through the publish wizard I decided to create a new Azure App Service

image

I made use of an existing Resource Group and App Service Plan that I had lying around.

image

After hitting Create and publish, Visual Studio went off thinking for what seemed like a long time with nothing happening. I knew it was probably busy packaging and deploying, but I didn’t see anything appear in the Output window… not surprisingly, because Visual Studio pushes all the logging for the publish operation to the Web Publish Activity window.

image

Once it was done, Visual Studio launched a browser window displaying the root of the newly created App Service. Unfortunately, this was not my Uno project.

image

After investigating a little I realised that the publish operation was uploading the Wasm project to a sub folder of the wwwroot folder within the App Service (eg wwwroot\FirstUnoProject.Wasm\dist\bin\Release\netstandard2.0\dist\index.html). I validated this by using the Advanced Tools from the Azure portal.

image

From the Advanced Tools, select Files

image

Browsing the files you can locate the index.html that gets published with the Wasm project. This is the file that hosts the wasm.

image

Unfortunately just adding the appropriate path to the index.html to the site url doesn’t seem to work (ie this doesn’t work: https://firstunoproject.azurewebsites.net/FirstUnoProject.Wasm/dist/bin/Release/netstandard2.0/dist/index.html). However, you can easily set up a new application to point to the dist folder. Go to Application Settings and under the section Virtual applications and directories, create a new mapping. In this case I’ve mapped /uno to the site\wwwroot\FirstUnoProject.Wasm\dist\bin\Release\netstandard2.0\dist folder (you can get the folder from the “path” shown in the Kudu – Files explorer) and I’ve made it an Application.

image

If you now attempt to go to the index.html page in your new mapped folder (eg https://firstunoproject.azurewebsites.net/uno/index.html) you’ll find that you’ll see the “Loading…” text that comes with the Uno project template but your application won’t load. If you spin up the Chrome debugging tool you’ll see that it’s not able to find the mono.wasm file with a 404 being raised. Don’t bother trying to work out whether the file exists or not because the issue is that whilst the file exists, the Azure App Service isn’t going to serve it because it’s not a known file type. Luckily there’s a simple solution. Add the following Web.config to your Wasm project and publish your application again.

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
   <system.webServer>
     <staticContent>
       <remove fileExtension=".clr" />
       <remove fileExtension=".dll" />
       <remove fileExtension=".json" />
       <remove fileExtension=".wasm" />
       <remove fileExtension=".woff" />
       <remove fileExtension=".woff2" />
       <mimeMap fileExtension=".dll" mimeType="application/octet-stream" />
       <mimeMap fileExtension=".clr" mimeType="application/octet-stream" />
       <mimeMap fileExtension=".json" mimeType="application/json" />
       <mimeMap fileExtension=".wasm" mimeType="application/wasm" />
       <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
       <mimeMap fileExtension=".woff2" mimeType="application/font-woff" />
     </staticContent>
     <httpCompression>
       <dynamicTypes>
         <add mimeType="application/octet-stream" enabled="true" />
         <add mimeType="application/wasm" enabled="true" />
       </dynamicTypes>
     </httpCompression>
   </system.webServer>
</configuration>

Now you should be able to launch your Uno wasm-based application hosted in Azure App Service

image

Connecting Flutter to Azure at MS Ignite | The Tour – Sydney

Flutter at MS Ignite | The Tour – Sydney

This year I was lucky enough to be able to present at Microsoft’s Ignite | The Tour, which has just concluded here in Sydney. What amazed me was that I got to present on a fledgling technology recently released by Google, Flutter. Of course, this by itself wouldn’t have been sanctioned at a Microsoft event, so the important part was that I was presenting on connecting Flutter to Azure, and highlighting just how powerful the Azure options are for mobile app developers on any platform.

Using Flutter to develop cloud enabled mobile applications

Flutter is one of the newest cross-platform mobile application development frameworks and brings with it the ability to generate high-fidelity applications that look amazing on every device. This session begins with a very brief overview of Flutter, covering the tools and resources required to get started. The remainder of the session connects a Flutter application to key Azure services such as Identity and App Center. The key takeaway from this session is how Azure accelerates the development of mobile applications irrespective of what technology or framework the app is developed with.

In this post I’m going to walk through the demo portion of the session, which revolved around the creation of a very simple task style application.

Getting Started – New Project

Let’s get started by creating a new project in VS Code (if you haven’t already got the Flutter SDK installed, go to https://flutter.io and follow the Get Started button in the top right corner). In VS Code, if you press Ctrl+Shift+P and type Flutter you’ll see all the available options.

image

Select “Flutter: New Project” and you’ll be prompted to give your project a name – I was short on inspiration for a name for the project, hence “World’s Best Flutter App”. Note that the name of your project needs to be all lowercase, no spaces or special characters.

image

You’ll need to select a folder to put your new project in, and then VS Code will generate a default application that consists of a basic UI which includes a button that increments a counter on the screen. We’re going to start by removing most of this and replacing it with:

import 'package:flutter/material.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'World\'s Best Flutter App',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: MyHomePage(title: 'World\'s Best Flutter App'),
    );
  }
}

class MyHomePage extends StatefulWidget {
  MyHomePage({Key key, this.title}) : super(key: key);

  final String title;

  @override
  _MyHomePageState createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text(widget.title)),
    );
  }
}

As with all new projects, at this point I’d highly recommend running the application to make sure everything has been correctly created and that you have a running application – there’s nothing worse than investing hours into crafting an amazing solution only to have to debug it for hours because of some issue that happened early in the process. At this point there’s not too much to the application other than the title bar.

image

Basic Task List

The first piece of functionality we’re going to add is the ability to enter a task and see it appear in a list. To do this we need a text entry (TextField widget) and a list (ListView widget), and we’re going to present them with the text entry above the list in a vertical stack (Column widget). Before adding the widgets I’m going to add a couple of instance variables to the _MyHomePageState class (note that this is the state that goes with the MyHomePage stateful widget):

class _MyHomePageState extends State<MyHomePage> {
  //// Controller for TextField and List of tasks ////
  final TextEditingController eCtrl = new TextEditingController();
  List<String> tasks = [];

Now we can add our widgets to the body property on the Scaffold widget:

return Scaffold(
  appBar: AppBar(title: Text(widget.title)),
  ////  Body of the app is made up of a single column ////
  body: new Column(
    children: <Widget>[
      ////  TextField to capture input ////
      new TextField(
        controller: eCtrl,
        onSubmitted: (text) {
          tasks.add(text);
          eCtrl.clear();
          setState(() {});
        },
      ),

      ////  Expand ListView to remaining space ////
      new Expanded(
        child: new ListView.builder(
          itemCount: tasks.length,
          itemBuilder: (BuildContext ctxt, int index) {
            return Padding(
              padding: EdgeInsets.fromLTRB(10, 10, 5, 5),
              child: Text(tasks[index]),
            );
          },
        ),
      ),
    ],
  ),
);

A couple of things to note here:

  • It appears that I’ve added a couple of trailing commas – this is by design and lets the auto formatter know to position things on new lines. This is really important as Flutter code can get really messy if you don’t keep an eye on the layout of the code
  • Pressing Shift+Alt+F in VS Code will apply the auto formatter and make the code much more readable – take a look at the following image to see how VS Code presents the above code
image

At this point we have a working app where a user can enter a task and it’s added to a list of tasks that is presented in the lower area of the screen. There are plenty of online tutorials etc that talk about layout and how setState works so I’m not going to cover that here. If you’re unfamiliar with Flutter/Dart code I would suggest at this point taking the time to go and understand what’s going on in the code – we’re going to build on it, so it’s important you understand how things work before we move on.

Visual Studio App Center

Before adding any more functionality to the app, I want to first take a side step to discuss the use of Visual Studio App Center which has been specifically created by Microsoft to support app developers, providing capabilities for building, testing, distributing apps, along with services such as analytics, crash reporting and push notifications. Currently support for Flutter is via a third party plugin which only provides analytics and crash reporting but we can expect this to grow as more developers take on Flutter and put pressure back onto Microsoft to provide official support, similar to what’s already provided for React Native and other cross platform technologies.

For our tasks app we’re going to make use of the analytics and crash reporting, as well as the support for distributing the application via App Center. To do this we first need to go to App Center and register two applications: one for iOS and one for Android. Go to either https://mobile.azure.com or https://appcenter.ms and sign in using your preferred credentials – my preference is to use my Office 365 (ie Azure Active Directory) account as this plays nicely into the way I sign into the rest of Azure and makes things like billing really easy. After creating your account, if you locate the apps list, then in the top right corner you should see an Add new button, which gives you the option to Add new app

image

In the Add new app flyout, you need to provide an App name and provide information about the OS and Platform – these are more important if you’re going to use App Center to build the application, but they’re also relevant to the distribution process, so it’s important that you set them correctly (Note: Adam Pedley has a great walk-through of using App Center for build and distribution of Flutter apps). Since the App name needs to be unique within your App Center instance, it makes sense to use some sort of platform identifier as part of the name so it’s easy to identify the iOS and Android apps in the list.

image

After creating the app, you’ll be presented with some platform-specific getting started documentation. Most of this can be ignored as it’s not relevant to our Flutter application. However, you do need to extract the id that’s been assigned to your application. This can be found as part of the code samples eg:

image

After creating the app registrations in App Center, we now need to add the Flutter plugins to our application and initialize them with the app id. Note that because App Center issues a different app id for each platform, you need to include some conditional platform logic to determine which app id to use. There are three plugins you need to add to your pubspec.yaml file under the dependencies section:

All packages are based on the source code at https://github.com/aloisdeniel/flutter_plugin_appcenter developed by Aloïs Deniel and Damien Aicheh

dependencies:
  flutter:
    sdk: flutter
  appcenter_analytics: ^0.2.1
  appcenter_crashes: ^0.2.1
  appcenter: ^0.2.1

After adding in the packages, we also need to import the relevant types to the top of our main.dart file:

////  Imports for platform info ////
import 'package:flutter/foundation.dart' show defaultTargetPlatform;
import 'package:flutter/foundation.dart' show TargetPlatform;

////  Imports for App Center ////
import 'package:appcenter/appcenter.dart';
import 'package:appcenter_analytics/appcenter_analytics.dart';
import 'package:appcenter_crashes/appcenter_crashes.dart';

Next we need to initialise the App Center plugin by adding code to the _MyHomePageState class:

class _MyHomePageState extends State<MyHomePage> {
  ////  Identifier used for initialising App Center ////
  String _appCenterIdentifier = defaultTargetPlatform == TargetPlatform.iOS
      ? "3037d80f-XXXXXXXXXXX-adb968c67880"
      : "f6737675-XXXXXXXXXXX-be38b29b0f89";

And then a call to the AppCenter.start method when the class initialises – we’re overriding the initState method to call this logic when the instance of the _MyHomePageState is first created:

////  Override Initialise State ////
@override
void initState() {
  super.initState();
  initAppCenter();
}

////  Initialise AppCenter ////
void initAppCenter() async {
// Initialise AppCenter to allow for event tracking and crash reporting
  await AppCenter.start(
      _appCenterIdentifier, [AppCenterAnalytics.id, AppCenterCrashes.id]);
}

The last thing we’re going to do is to add a call to the trackEvent method for when an item is added to the tasks list. For this we’re going to modify the onSubmitted callback on the TextField

onSubmitted: (text) {
  ////  Tracking event ////
  AppCenterAnalytics.trackEvent("Adding item", {"List Item": text});

Now when you launch the app, and start adding tasks you’ll see some tracking of app related activity appear within the Analytics tab of App Center:

Note: If you get errors at this point, you may need to stop and restart your application to make sure the newly added packages are correctly built and deployed as part of your application. Hot reload doesn’t always handle this well.

image

You can see information about events raised by the app

image

Including the ability to drill down into the event and see more information – in this case you can see the List Item value that was added to the task list. Note that this is an application specific key-value combination specified as part of the call to trackEvent.

image


Build and Release Automation

Before we can start distributing our app via App Center we typically set up a build and release (aka DevOps) pipeline for our application – at Built to Roam this is one of the first things we encourage all our clients to do as it significantly cuts down on the manual steps involved, resulting in fewer errors and less wasted time. For this we rely on Azure DevOps, and again a big shout out to Aloïs Deniel who’s created an extension for Azure DevOps – https://marketplace.visualstudio.com/items?itemName=aloisdeniel.flutter

image

After adding this extension to your Azure DevOps tenant, you can easily set up a build and release pipeline – there are a few steps involved in this, so I’ll keep them for a follow up post. The build steps involve tasks for installing Flutter, building the app with Flutter, copying the generated app to the artefacts folder and then publishing the artefacts; a rough sketch follows below. The release step simply involves uploading the generated app to App Center. You can of course adjust these to add functionality, perhaps targeting different environments for testing, or uploading to the relevant stores directly from your release pipeline.
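
As a rough sketch, the build portion of the pipeline might look something like the yaml below. The Flutter task names and inputs come from the marketplace extension and are assumptions that may vary between extension versions, so treat this as a starting point rather than a definitive pipeline:

steps:
  - task: FlutterInstall@0          # assumed: installs the Flutter SDK on the agent
  - task: FlutterBuild@0            # assumed: runs the flutter build for the chosen target
    inputs:
      target: apk
      projectDirectory: .
  - task: CopyFiles@2               # copy the generated app to the artefacts folder
    inputs:
      contents: '**/*.apk'
      targetFolder: '$(Build.ArtifactStagingDirectory)'
  - task: PublishBuildArtifacts@1   # publish the artefacts for the release pipeline
    inputs:
      pathToPublish: '$(Build.ArtifactStagingDirectory)'
      artifactName: 'drop'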

Update 22/9/2019: There are two other extensions for Azure DevOps for working with Flutter. The first simply makes it easier to install the Flutter SDK. The second is an update to the extension Aloïs created that adds AAB support – if you want to use this extension, make sure you uninstall the original extension Aloïs created because the two conflict and appear as a duplicate.

Once you have your build and release pipelines in place, you should start to see your releases appear in the Distribute / Releases tab in App Center

image

If you open App Center on your iOS or Android device you can install each release for testing.

Azure App Service

Right now we actually have a pretty useless app since all it does is record tasks in memory. When you restart the app, your list of tasks will be gone, and there’s no way to have your list of tasks appear on another device. We’re not going to complete the entire app in this blog post, but what we do want to do is show how easily you can connect a Flutter app to some backend services hosted in Azure.

First up, let’s take a look at the backend service we’re going to connect to. In the demo I used an ASP.NET Core application that was published to an Azure App Service, but I could just as easily have used a Functions app. The application consists of a single controller that currently only supports a Get operation to retrieve a list of tasks, which for the moment are hardcoded.

[Route("api/[controller]")]
[ApiController]
public class TasksController : ControllerBase
{
    // GET api/values
    [HttpGet]
    public ActionResult<IEnumerable<Task>> Get()
    {
        return new[]
        {
            new Task {Title = "Task 1", Completed = true, },
            new Task {Title = "Task 2", Completed = false,},
            new Task {Title = "Task 3", Completed = true, },
            new Task {Title = "Task 4", Completed = false,},
            new Task {Title = "Task 5", Completed = true, },
        };
    }
}

public class Task
{
    public string Title { get; set; }
    public bool Completed { get; set; }
}

Swagger

It’s also worth noting that in the Startup.cs of this application I added in the logic to generate Swagger information for the service (you need to add a NuGet reference to Swashbuckle.AspNetCore.Swagger, Swashbuckle.AspNetCore.SwaggerGen and Swashbuckle.AspNetCore.SwaggerUI):

public class Startup
{
    …
    public void ConfigureServices(IServiceCollection services)
    {
        …
        services.AddSwaggerGen(c =>
            {
                c.SwaggerDoc("v1", new Info { Title = "Simple ToDo", Version = "v1" });
            });
        …
    }

    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        …
        app.UseSwagger();
        app.UseSwaggerUI(c =>
            {
                c.SwaggerEndpoint("/swagger/v1/swagger.json", "Simple ToDo V1");
            });
        …
    }
}

Update 12/11/2019: If you’re using .NET Core 3 preview, you’ll need to upgrade to the preview of the Swashbuckle NuGet packages and change the Info class reference to OpenApiInfo. Although at the time of writing there is an issue with the Dart code generator based on the swagger generated by .NET Core 3 preview, so perhaps best to stick with .NET Core 2.2 until v3 is officially released.
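
For example, after swapping in the updated packages, the SwaggerDoc registration would look something like this:

services.AddSwaggerGen(c =>
    {
        c.SwaggerDoc("v1", new OpenApiInfo { Title = "Simple ToDo", Version = "v1" });
    });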

This application was published to a new Azure App Service, which meant that both the /api/tasks endpoint and the swagger endpoints were publicly available. We’ll come back to securing these in a bit. To be able to call the tasks endpoint we could have gone ahead and created a raw http request, and then have to worry about decoding the returned json into a series of entities, for which we’d have to write, or generate, the decoding logic. Instead, I went over to SwaggerHub and imported the swagger endpoint:

image

After selecting “Import and Document API” I entered the url for the swagger endpoint on the Azure App Service eg https://simpletodonick.azurewebsites.net/swagger/v1/swagger.json

image

After hitting Import, SwaggerHub will convert your json formatted swagger into YAML – we’re not really interested in this piece, since what we’re after is some generated Dart code. However, before we can go ahead and generate the Dart code we first have to configure the generator options for Dart. Click on the Export button in the top right of the SwaggerHub page, followed by Codegen Options.

image

Locate “dart” in the left hand tree, and then set the browserClient property to false – by default the Dart generator creates code suitable for Dart being used for web development. For Flutter you need to set this property to false.

image

Going back to the Export menu, now select “dart” from under the Client SDK option.

image

This will automatically generate and download a package that includes a strongly typed set of classes for accessing the tasks endpoint. I extracted the generated zip file and placed its contents into a new sub-folder called “tasks” at the root of the Flutter application.

image

After adding the files into the folder, I then needed to update the pubspec.yaml file for my application to include the package as a dependency, as well as the http package:

dependencies:
  flutter:
    sdk: flutter
  appcenter_analytics: ^0.2.1
  appcenter_crashes: ^0.2.1
  appcenter: ^0.2.1
  http: '>=0.11.1 <0.12.0'
  swagger: 
    path: tasks 

(note that “swagger” correlates to the name of the imported package which is defined with the pubspec.yaml file within the tasks folder)

Update 22/9/2019: There’s an incompatibility between a new Flutter project and the generated Dart code. To fix it you need to update the pubspec.yaml inside the tasks folder to include the same environment value as the pubspec.yaml for your application.

Now to make use of the generated code in our application, I also needed to add an import statement to the main.dart file.

////  Import for Swagger ////
import 'package:swagger/api.dart';

As we’re going to be using the Task entity described by the Swagger I need to update the list variable from a List<string> to List<Task>. I also added the base url for the app service.

////  Change List to use Task ////
List<Task> tasks = [];

////  Base url for the Simple ToDo service ////
final _baseUrl = 'https://simpletodonick.azurewebsites.net';

At this point I also needed to fix up the existing task list functionality which expected a string, to use a Task entity. This was in two places, firstly when creating a new task I needed to create a new Task:

tasks.add(
  Task()..title = text,
);

And then when displaying the tasks in the list

child: Text(tasks[index].title),

Now with the app back up and running, we can go ahead and call the service in order to retrieve the tasks when the app launches.

void loadData() async {
  ////  Calling App Service ////

  // Setup api client with base url and token returned by auth process
  var client = ApiClient(basePath: _baseUrl);

  var tasksApi = new TasksApi(client);

  var downloadedTasks = await tasksApi.callGet();
  setState(() {
    tasks = downloadedTasks;
  });
}

The loadData method should be called from the initState method, just after calling initAppCenter. Running the app at this point should return a list of tasks on startup and then allow subsequent tasks to be added (of course at the moment those tasks aren’t saved anywhere!)
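
The updated initState would then look something like this:

////  Override Initialise State ////
@override
void initState() {
  super.initState();
  initAppCenter();
  // Load the task list from the app service on startup
  loadData();
}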

Authentication with Azure B2C

Currently there is no security applied to the app service, which is not only bad practice, it also means that it is impossible to personalise the task list to the individual user, since everyone will be returned the same list of tasks. The good news is that we can add authentication to our service without writing a single line of code. To do this we’re going to use Azure B2C. However, Azure App Services can also use a variety of different social credential providers to authenticate the user.

For this application I created a new instance of Azure B2C and set up the default B2C_1_signupandin user flow (aka policy). After creating the instance and the user flow, you still need to associate the Azure B2C instance with the app service and vice versa. To do this, start by creating a new application registration within the Azure B2C portal – go to the Applications tab and click the Add button.

image

In creating the application registration we want to enable both web app/api and native client. You’ll notice that under the Native Client area there is a Custom Redirect URI specified. To do authentication using the flutter_appauth plugin (which I’ll come to in a second) the Custom Redirect URI should be in a similar format ie <reverse domain>://oauthredirect.
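
For example, the custom redirect URI used later in this post is com.builttoroam.simpletodo://oauthredirect.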

After creating the application registration, take note of the Application ID – you’ll need this when we update the Authentication on the App Service that requires a Client Id.

The next thing to do, also in the Azure B2C portal is to navigate to the User flows (policies) tab and then click the Run user flow button.

image

From the Run user flow dialog, copy the link at the top (eg https://worldsbestflutterapp.b2clogin.com/worldsbestflutterapp.onmicrosoft.com/v2.0/.well-known/openid-configuration?p=B2C_1_signupandin)

Now we need to switch across to the App Service in the Azure Portal and click on the Authentication tab. Initially App Service Authentication will be set to Off and there is a warning that anonymous access is permitted.

image

Switch the App Service Authentication to On and under Authentication Providers, click through on the Azure Active Directory settings.

image

Use the Application Id captured when setting up the Azure B2C as the Client ID. Use the link recorded from the Run user flow dialog as the Issuer Url. Click OK to commit the Azure Active Directory settings.

Back in the Authentication / Authorization settings, make sure that the “Action to take when request is not authenticated” dropdown is set to “Log in with Azure Active Directory” – this will enforce sign in if no Authorization token is presented.

image

After hitting Save, if you attempt to access the app service you’ll find that you get a security exception (if sending raw requests), or redirected to the sign in prompt if attempting to load the endpoint in a browser.

Authentication in Flutter using Flutter_AppAuth

After setting up authentication on the app service, if you attempt to run the application it won’t be able to retrieve the list of tasks as the requests are unauthenticated. In order to create a request that is authenticated (ie supplies a valid Authorization header in this case) we need to get the user to sign into the application. Luckily there is a Flutter plugin, flutter_appauth (source code), created by Built to Roam colleague Michael Bui, that wraps the AppAuth iOS and Android libraries.

image

There are a few steps involved in getting our application set up to use flutter_appauth. We’ll start by adding the package reference to the pubspec.yaml file:

dependencies:
  flutter:
    sdk: flutter
  flutter_appauth:
  ...

Next, we need to upgrade the Android application to use the AndroidX libraries instead of the older support libraries. To do this we need to open the Android part of the flutter application in Android Studio. Right-click on the android folder in VS Code and select the Open in Android Studio option.

image

When prompted to “Import Project from Gradle” you can accept the defaults and click Ok. The application may take a few minutes to open in Android Studio and you’ll need to allow a bit of extra time for Android Studio to finish processing the application before the Migrate to AndroidX option becomes enabled in the Refactor menu:

image

Side note: The first couple of times I tried this whilst preparing this demo Android Studio kept saying that there was nothing to update. It turns out that there was an update to the Flutter plugin to Android Studio that was waiting to be installed – I did this, which resulted in Android Studio reloading. After reloading the folder structure under Project expanded to show the referenced flutter plugins as well as any external libraries. At this point running Migrate to AndroidX worked.

When Migrate to AndroidX does work for you, it’s not immediately obvious that something has changed… which is because it hasn’t. Instead, Android Studio will have opened up a tool window at the bottom of the screen, prompting you to confirm the Refactoring Preview. Click Do Refactor to complete the migration.

image

If you look in the /android/app/build.gradle file you should see that the dependencies list at the end of the file has been updated.

dependencies {
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test:runner:1.1.0-alpha'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.0-alpha'
}

At the time of writing this post the migration process references alpha versions. You can update this to reference the stable versions now:

dependencies {
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test:runner:1.1.0'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.0'
}

Whilst in the build.gradle file, the compileSdkVersion and targetSdkVersion should be increased to at least 28. Lastly, update the defaultConfig element to include manifestPlaceholders

defaultConfig {
    applicationId "com.example.worldsbestflutterapp"
    minSdkVersion 16
    targetSdkVersion 28
    ...
    manifestPlaceholders = [
            'appAuthRedirectScheme': 'com.builttoroam.simpletodo'
    ]
}

Note that the appAuthRedirectScheme should match the reverse domain part of the Custom Redirect URI property set when setting up Azure B2C earlier.

For iOS open the /ios/Runner/Info.plist file and add the CFBundleURLTypes key/value as follows.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	...
	<key>CFBundleURLTypes</key>
	<array>
		<dict>
			<key>CFBundleTypeRole</key>
			<string>Editor</string>
			<key>CFBundleURLSchemes</key>
			<array>
				<string>com.builttoroam.simpletodo</string>
			</array>
		</dict>
	</array>
</dict>
</plist>

We’re now ready to write the code to authenticate the user within the application. Start by adding an import to bring in flutter_appauth

////  Import for Flutter_AppAuth ////
import 'package:flutter_appauth/flutter_appauth.dart';

Next, add in some constants that are needed for the authentication process to the beginning of the _MyHomePageState class, as well as an instance of the FlutterAppAuth class.

class _MyHomePageState extends State<MyHomePage> {
  ////  Attributes required for authenticating with Azure B2C ////
  final _clientId = '7c57c3a6-bedb-41d6-9dcd-318224013a26';
  final _redirectUrl = 'com.builttoroam.simpletodo://oauthredirect';
  final _discoveryUrl =
      'https://worldsbestflutterapp.b2clogin.com/worldsbestflutterapp.onmicrosoft.com/v2.0/.well-known/openid-configuration?p=B2C_1_signupandin';
  final List<String> _scopes = [
    'openid',
    'offline_access',
  ];

  ////  Instance of FlutterAppAuth ////
  FlutterAppAuth _appAuth = FlutterAppAuth();

At the beginning of the loadData method we created earlier, add a call to the authorizeAndExchangeCode method on the _appAuth instance.

void loadData() async {
////  Authenticate with Azure B2C ////
  var result = await _appAuth.authorizeAndExchangeCode(
      AuthorizationTokenRequest(_clientId, _redirectUrl,
          discoveryUrl: _discoveryUrl, scopes: _scopes));

Now, the only thing left to do is to pass in the token that gets returned from the authentication process into the call to retrieve the tasks. To do this we actually need to make a minor change to the swagger generated code to tell it to use OAuth credentials. Locate the api_client.dart file, which for my project is at /tasks/lib/api_client.dart and change the ApiClient constructor as follows:

ApiClient({this.basePath: "https://localhost"}) {
  // Setup authentications (key: authentication name, value: authentication).
  _authentications['Bearer'] = new OAuth();
}

This sets the ApiClient to use OAuth which means that you can then call the setAccessToken method on the instance of the ApiClient, which should be the code that immediately follows the call to authorizeAndExchangeCode.

var client = ApiClient(basePath: _baseUrl);
client.setAccessToken(result.idToken);

And there you have it, your application should be good to go. When you launch the application, it will redirect to the Azure B2C sign in page. After authenticating it will redirect back to the application and load the task list.

image

Big shout out to the community for rolling so many great plugins for Flutter. Frameworks such as Xamarin.Forms and Flutter are progressively making it easier to build high quality cross-platform applications.


Contact Built to Roam for more information on building cross-platform applications


Getting Started with Xamarin.Forms and Authenticating with ADAL

Getting Started with Xamarin.Forms and Authenticating with ADAL

Previous posts in this sequence:
Getting Started with Xamarin.Forms and Multi-Targeting in Visual Studio
Getting Started with Xamarin.Forms and SDK Versions
Getting Started with Xamarin.Forms with MvvmCross

Now that we’ve got the basics of an application, we’re going to add some authentication using the Azure Active Directory Authentication Library. We’re going to start with registering an application with Azure Active Directory (AAD) (the docs do cover this here but I figured I’d cover it again anyway). Start by heading over to the Azure portal and selecting Azure Active Directory from the set of tabs on the left.

image

In the list of tabs for the AAD, find the App Registrations tab

image

Click “New application registration” and fill in the details for the new app

image

Note that the Sign-on URL doesn’t have to be a real URL, just one that both the portal and the app know about – we’ll use this later! After you’re done creating the app registration you should be able to retrieve the Application ID. Save this information for later as we’ll need it when we’re authenticating the user within our application.

image

The pieces of information you’ll need in order to use ADAL within your application are:

clientId – This is the Application ID that was displayed after you created the app registration in the Azure portal. This is a GUID

redirectUri – This is the Redirect URI that you specified when creating the app registration. If you didn’t write this down, or you want to double check, or change it, you can do so by clicking the Redirect URIs tab under Settings for the app registration. This is a URL (e.g. https://stratapark)

resourceId – This is the identifier of the resource you want to access. In this case we just want to authenticate the user, so we’re going to use the default resource, which is the AAD directory itself, https://graph.windows.net/. This can be a URL or a GUID depending on the type of resource you’re referencing

tenantId – This is the identifier of the tenant that you’re authenticating against. This can be the URL or GUID of your tenant.

To retrieve the tenantId you need to select the Properties tab of the Azure Active Directory and copy the Directory ID

image

https://www.nuget.org/packages/Microsoft.IdentityModel.Clients.ActiveDirectory/

Now that we’ve setup the app registration, we’re good to start writing some code. Firstly, we need to add a reference to the ADAL library from NuGet – search for Microsoft.IdentityModel.Clients.ActiveDirectory

image

Add the ADAL library to all projects.

Let’s start with the UI and add a button to the MainPage.

<Button Text="Authenticate" Command="{Binding AuthenticateCommand}" />

And of course we need a command within the MainViewModel. In fact we’ll add the bulk of the authentication code to the MainViewModel too:

private IMvxCommand authenticateCommand;
public IMvxCommand AuthenticateCommand => authenticateCommand ?? (authenticateCommand = new MvxAsyncCommand(Authenticate));

private const string AuthEndpoint = "https://login.microsoftonline.com/{0}/oauth2/token";

private const string ResourceId = "https://graph.windows.net/";

private const string TenantId = "[your tenantId]";
private const string ClientId = "[your clientId]";
private const string RedirectURI = "[your redirectUri]";

public async Task Authenticate()
{
    string authority = string.Format(CultureInfo.InvariantCulture, AuthEndpoint, TenantId);
    var authContext = new AuthenticationContext(authority);

    var result = await authContext.AcquireTokenAsync(ResourceId, ClientId, new Uri(RedirectURI), PlatformParameters);
    Debug.WriteLine(result.AccessToken);
}

The only thing missing is PlatformParameters (the final argument passed to AcquireTokenAsync in the code above). As the name suggests, these are platform specific parameters required by the ADAL library. For each platform we need to expose a property that returns an IPlatformParameters within the MainViewModel.Platform.cs for that platform.

Android

private IPlatformParameters PlatformParameters => new PlatformParameters(CurrentActivity, true, PromptBehavior.Auto);
public Activity CurrentActivity
{
     get
     {
         var top = Mvx.IoCProvider.Resolve<IMvxAndroidCurrentTopActivity>();
         return top.Activity;
     }
}

iOS

private IPlatformParameters PlatformParameters => new PlatformParameters(GetTopViewController(),true, PromptBehavior.Auto);
public static UIViewController GetTopViewController()
{
     var window = UIApplication.SharedApplication.KeyWindow;
     var vc = window.RootViewController;
     while (vc.PresentedViewController != null)
         vc = vc.PresentedViewController;


    if (vc is UINavigationController navController)
         vc = navController.ViewControllers.Last();


    return vc;
}

UAP

private IPlatformParameters PlatformParameters => new PlatformParameters(PromptBehavior.Always, false);

NetStandard

private IPlatformParameters PlatformParameters => null;

Note: you need to provide the NetStandard implementation otherwise the NetStandard target won’t build.

There is one more thing we need to add. Android requires an additional piece of code within the Activity that hosts the Xamarin.Forms application

protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
     base.OnActivityResult(requestCode, resultCode, data);
     AuthenticationAgentContinuationHelper.SetAuthenticationAgentContinuationEventArgs(requestCode, resultCode, data);
}

If your Android application authenticates but doesn’t ever return the access token, chances are that your OnActivityResult override is either missing, or on the wrong Activity – I spent a bit of time trying to work out why iOS and UWP worked but Android would just hang; turns out I was missing this override.

There you have it – you’re good to go with using ADAL to authenticate your users within your Xamarin.Forms application

Good-bye HipChat, and don’t let the door hit you on the way out!

Good-bye HipChat, and don’t let the door hit you on the way out!

It was interesting to see this week that Atlassian doomed the future of HipChat and its successor, Stride, with an aggressive wrap-up schedule – the services are set to be discontinued on February 15, 2019. At Built to Roam, as a consulting company, we use a number of messaging tools including Messenger, Skype, Skype for Business / Microsoft Teams, Slack, HipChat and a few others. The upshot is that none of these tools do a great job of even their primary function (i.e. chat conversation between two or more parties), as I’ve posted about previously.

As a couple of posts have indicated, the messaging market has become over-populated – for a while it felt like I was installing a new messaging app every second day. When Teams first came to the market, there was a lot of criticism aimed at it because it was a primitive offering in comparison to both Slack and HipChat, but its rapid growth has started to put pressure on other players in the market. I think a rationalisation of the market was due, and I’m not sorry to see the back of HipChat. As one of the older products in the market, it never quite understood the need for users to belong to multiple organisations and to be able to switch between them.

There are some posts that are talking up the closure of HipChat/Stride as an attempt by Slack and Atlassian to team up in the fight against Microsoft Teams. So the question is, will this make a difference? Will it slow the growth of Microsoft Teams? Will it help Slack win over the corporate space?

Recently, Microsoft Teams announced a free tier, the absence of which was one of the things that held a lot of smaller companies and teams back from using Microsoft Teams. This move in itself has weakened Slack’s position in the market. However, the true hook for Microsoft Teams, and in my opinion the sole reason for its wide adoption (because let’s be honest, it’s far from being a great product!), is that it allows users to sign in using their Office 365 / Microsoft 365 account. In other words, if your company has moved, or is moving, to Office 365, you can use your existing credentials to sign into Microsoft Teams. And of course, once you do, you can see and communicate with all the other users in your organisation. Can you do this with Slack? The short answer is no. The long answer is yes, but you need to do a bunch of stuff, including paying a ton of money for stuff that should be out of the box (seriously, like what the? https://get.slack.help/hc/en-us/categories/200122103-Workspace-Administration#configure-access-security).

The ridiculous thing is that integration into Azure Active Directory (i.e. using Office 365 and Microsoft 365 credentials) is pretty straightforward. Is there something that Slack can do to get the jump on Microsoft Teams? Yes: provide out of the box support to sign in using either G-Suite or Office 365 credentials. In the future there will be two types of organisation, those that use Office 365, and those that don’t. Most of those in the latter group will probably use some form of G-Suite, so providing out of the box support for G-Suite should be on the radar of any enterprise software vendor.

I know this post has gone on a bit, but my last point is that I wish services would stop charging a premium for improving the security of their service. Integration with Azure Active Directory and G-Suite should be included in the cheapest tier of any offering. Why would you compromise the security of your service and the data of your users by not providing this?

Authentication Redirection Loop with Angular Application and Azure Active Directory

Authentication Redirection Loop with Angular Application and Azure Active Directory

Recently we ran into some difficulty with an Angular application that was being retrofitted into a different environment. During the initial development the Angular application had been pushed to Azure for testing. However, the final resting place for the application was on an on-premises server. Whilst the switch was relatively painless, with the only major change being a move to persistent file storage instead of blob storage, we also had to shift from our development Azure AD tenant (the Angular application, and the associated services, uses Azure AD to authenticate and authorize users) to the client’s Azure AD tenant. This shift required creating two new application registrations within the client’s Azure AD tenant.

Unfortunately after creating the new registrations, and updating the Angular application (and the corresponding services), any attempt to log in with valid credentials resulted in a continual loop between the Angular application and the Azure AD login prompt. In this case we were only using Azure AD to authenticate users and other than controlling access to the application services there weren’t any other permissions that users would have to agree to.

In the past I’ve posted about how administrators have to grant permission to users within their tenant to access an application (see https://nicksnettravels.builttoroam.com/post/2017/01/24/Admin-Consent-for-Permissions-in-Azure-Active-Directory.aspx). Usually there is an Azure AD login error when users attempt to sign in. In this case, for some reason we either missed the error message or it was being obscured by the automatic redirection between the Angular application and the Azure AD login prompt.

We did end up finding the solution, thanks to the Azure AD team at Microsoft, who quickly identified the error in our Fiddler trace. The critical request/response was:

Request

GET https://login.microsoftonline.com/<tenantid>/oauth2/authorize?response_type=token&client_id=<clientid>&resource=<resourceid>&redirect_uri=<uri>&prompt=none&login_hint=admin


Response

HTTP/1.1 302 Found
Location: <uri>
<html><head><title>Object moved</title></head><body>
<h2>Object moved to <a href="<uri>/#error=interaction_required&amp;error_description=AADSTS65001%3a+The+user+or+administrator+has+not+consented+to+use+the+application+with+ID+%27.+Send+an+interactive+authorization+request+for+this+user+and+resource

The important part is that the error indicates that either the user or administrator has not consented to use of the application, and that there should be an interactive authorization request. This is a little cryptic, but going back to my previous post we can simply add "prompt=admin_consent" to the login request – assuming an administrator logs in, they can then grant access to the application to all users in the tenant.
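
Applied to the earlier authorize request, that looks something like this (placeholders as before):

GET https://login.microsoftonline.com/<tenantid>/oauth2/authorize?response_type=token&client_id=<clientid>&resource=<resourceid>&redirect_uri=<uri>&prompt=admin_consent&login_hint=admin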

There is actually a much easier way for single tenant applications, which this is. Instead of waiting for an administrator to log in, permission can be granted via the Azure portal:

– Select the Directory where the application is registered (drop down in the top right corner of the Azure portal where the signed in user is listed)

– Select Azure Active Directory from the sidebar menu

– Select App Registrations

– Select the application registration you want to grant access to

– From All settings, click on the Required Permissions link

– Click “Grant Permissions”

image

This will give all users access to the application. If you have multiple registrations (eg one for the web site and one for the corresponding services), don’t forget to grant permission to both registrations.

Again, big thanks to the team at Microsoft for pointing us to the solution

Call out to the ADAL team! – Authenticate Using External Browser

Call out to the ADAL team! – Authenticate Using External Browser

In my post, Authorizing Access to Resources using Azure Active Directory, I talk about authenticating using the built in browser on the device, rather than authenticating via a webview, which is all too common. Unfortunately despite being fully supported by Azure Active Directory, the team responsible for ADAL haven’t, as far as I can tell, provided support for using an external browser to authenticate.

I was super impressed when I downloaded the Facebook app on Windows to find that it supports “Log in with Browser”.

image

In my opinion, this not only represents a more secure form of authentication (since I can validate the website I’m signing into), it is also a better experience, since I’m already logged into Facebook in the browser anyhow.

I definitely encourage developers to consider using the external browser, rather than supporting SDKs and libraries that use the in-app browser.

UseWindowsAzureActiveDirectoryBearerAuthentication vs UseJwtBearerAuthentication for Authorization with Azure Active Directory for an ASP.NET Web API

UseWindowsAzureActiveDirectoryBearerAuthentication vs UseJwtBearerAuthentication for Authorization with Azure Active Directory for an ASP.NET Web API

In my previous post, Securing a Web API using Azure Active Directory and OWIN, I covered how to authorize requests against Azure Active Directory using the UseWindowsAzureActiveDirectoryBearerAuthentication extension method in the OWIN startup class. This extension method has been designed specifically for Azure Active Directory but if you think about it, the Authorization token is just a JWT token, so in theory you could take a much more generic approach to authorizing access by validating the JWT. This can be done using the UseJwtBearerAuthentication extension method.

There are a couple of steps to using the UseJwtBearerAuthentication extension method. Firstly, in order to validate the signature of the JWT, we’re going to need the public certificate that matches the key identifier contained in the JWT. In my post on Verifying Azure Active Directory JWT Tokens I cover how to examine the JWT using https://jwt.io in order to retrieve the kid, retrieve the openid configuration, locate the jwks uri, retrieve the keys and save out the key as a certificate. In that post I used the certificate (ie wrapping the raw key in -----BEGIN CERTIFICATE----- and -----END CERTIFICATE----- markers) to validate the JWT; in this case I’ve copied the contents into a text file which I’ve named azure.cer and added it to the root of my web project (making sure the build action is set to Content so it is deployed with the website).

The next thing to do is to remove the UseWindowsAzureActiveDirectoryBearerAuthentication extension method, replacing it with the following code.

var fileName = HostingEnvironment.MapPath("~/") + "azure.cer";
var cert = new X509Certificate2(fileName);
app.UseJwtBearerAuthentication(new JwtBearerAuthenticationOptions
{
    AllowedAudiences = new[] { ConfigurationManager.AppSettings["ida:Audience"] },
    IssuerSecurityTokenProviders = new IIssuerSecurityTokenProvider[]
    {
        new X509CertificateSecurityTokenProvider(ConfigurationManager.AppSettings["ida:IssuerName"], cert)
    }
});

This code uses the azure.cer certificate file combined with the Audience and IssuerName which I’ve added to the web.config.

<add key="ida:Audience" value="a07aa09e-21b9-4e86-b269-a18903b5fe54" />
<add key="ida:IssuerName" value="https://sts.windows.net/55cc17b5-7d2a-418e-86a6-277c54462485/" />

The Audience is the application id (aka client id) of the Azure application registration. The IssuerName needs to match what appears in the JWT. Opening one of the tokens in https://jwt.io, it’s the iss value that you want to use as the IssuerName.
image

Now you can run the project and see that again the requests are validated to ensure they’re correctly signed.

Securing a Web API using Azure Active Directory and OWIN

Securing a Web API using Azure Active Directory and OWIN

In this post we’re going to look at how to use Azure Active Directory to secure a web api built using ASP.NET (full framework – we’ll come back to .NET Core in a future post). To get started I’m going to create a very vanilla web project using Visual Studio 2017. At this point VS2017 is still in RC and so you’ll get slightly different behaviour than what you’ll get using the Visual Studio 2015 templates. In actual fact the VS2015 templates seem to provide more in the way of out of the box support for OWIN. I ran into issues recently when I hadn’t realised what VS2015 was adding for me behind the scenes, so in this post I’ll endeavour not to assume anything or skip any steps along the way.

image

After creating the project, the first thing I always do is run it and make sure the project has been correctly created from the template. In the case of a web application, I also take note of the startup url, in this case http://localhost:39063/. However, at this point I also realised that I should do the rest of this post following some semblance of best practice and do everything over SSL. Luckily, recent enhancements to IIS Express make it simple to configure and support SSL with minimal fuss. In fact, all you need to do is select the web project node and press F4 (note, going to Properties in the shortcut menu brings up the main project properties pane, which is not what you’re after) to bring up the Properties window. At the bottom of the list of properties are SSL Enabled and SSL URL, which is https://localhost:44331/. Take note of this url as we’ll need it in a minute.

image

To setup the Web API in order to authorize requests, I’m going to create a new application registration in Azure Active Directory. This time I need to select Web app / API from the Application Type dropdown. I’ll give it a Name (that will be shown in the Azure portal and when signing into use this resource) and I’ll enter the SSL address as the Sign-on URL. This URL will also be listed as one of the redirect URIs used during the sign in process. During debugging you can opt to do this over HTTP but I would discourage this as it’s no longer required.

image

After creating the application, take note of the Application Id of the newly created application. This is often referred to as the client id and will be used when authenticating a user for access to the web api.

Application Id (aka Client Id): a07aa09e-21b9-4e86-b269-a18903b5fe54

We’re done for the moment with Azure Active Directory, so let’s turn to the web application we recently created. The authorization process for in-bound requests involves extracting the Authorization header and processing the bearer token to determine if the calling party should have access to the services. In order to do this for tokens issued by Azure AD I’ll add references to both the Microsoft.Owin.Security.ActiveDirectory and Microsoft.Owin.Host.SystemWeb packages.


Note: Adding these references takes a while! Make sure they’re completely finished before attempting to continue.

Depending on the project template, you may, or may not, already have a Startup.cs file in your project. If you don’t, add a new item based on the OWIN Startup class template.


The code for this class should be kept relatively simple:

using Microsoft.Owin;
using Owin;

[assembly: OwinStartup(typeof(SampleWebApp.Startup))]
namespace SampleWebApp
{
    public partial class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            ConfigureAuth(app);
        }
    }
}

Additionally, you’ll want to add another partial class file Startup.Auth.cs in the App_Start folder.

using Owin;

namespace SampleWebApp
{
    public partial class Startup
    {
        public void ConfigureAuth(IAppBuilder app)
        {
        }
    }
}

And now we get to adding the middleware that will be used to process the authorization header

// Requires: using System.Configuration; and using Microsoft.Owin.Security.ActiveDirectory;
public void ConfigureAuth(IAppBuilder app)
{
    app.UseWindowsAzureActiveDirectoryBearerAuthentication(
        new WindowsAzureActiveDirectoryBearerAuthenticationOptions
        {
            // Tenant and Audience come from web.config (see below)
            Tenant = ConfigurationManager.AppSettings["ida:Tenant"],
            TokenValidationParameters = new System.IdentityModel.Tokens.TokenValidationParameters
            {
                ValidAudience = ConfigurationManager.AppSettings["ida:Audience"]
            }
        });
}

This uses the configuration manager to extract the Tenant and Audience settings from the web.config (and subsequently the Azure portal settings when you publish to the cloud):

<add key="ida:Audience" value="a07aa09e-21b9-4e86-b269-a18903b5fe54" />
<add key="ida:Tenant" value="nicksdemodir.onmicrosoft.com" />

The tenant is the Id, or in this case, the domain of the tenant where the application is registered. The Audience is the application Id of the application registered in Azure AD.

Warning: If you run the application now and get an error relating to a missing type, you may have to revert the Microsoft.Owin.Security.ActiveDirectory package to the most recent v4 release. At the time of writing this post there seems to be an incompatibility between v5 and OWIN.

Reference to type 'TokenValidationParameters' claims it is defined in 'System.IdentityModel.Tokens.Jwt', but it could not be found

Ok, we’re ready to try making requests. I’m going to use Fiddler but you can use any other tool that’s able to generate and send HTTP requests. The first attempt will be a GET request to https://localhost:44331/api/values, which is one of the default controllers created from the project template. Depending on what your project template included, the ValuesController may, or may not, have the Authorize attribute applied to it. If, like me, you didn’t have the Authorize attribute applied to the ValuesController, you should get a valid response back to your HTTP request. In this case, you’re going to want to secure the ValuesController by adding the Authorize attribute:

[Authorize]
public class ValuesController : ApiController
{
    // GET api/values (default template action)
    public IEnumerable<string> Get() => new[] { "value1", "value2" };
}

Now, try making the request again – you should now get a 401 Unauthorized error. The body of the response should say:

{“Message”:”Authorization has been denied for this request.”}

Clearly this is the case since we didn’t send the Authorization header. This time, let’s add an Authorization header, with the word “Bearer” and a token consisting of random text:

Authorization: Bearer abcdefg
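For reference, the raw request (which is easy to compose in Fiddler) looks something like this:

GET https://localhost:44331/api/values HTTP/1.1
Host: localhost:44331
Authorization: Bearer abcdefg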

This should generate the same response. However, let’s dig into this a little further. To get more diagnostic information, add the following to the web.config file for the project:

<system.diagnostics>
  <switches>
    <add name="Microsoft.Owin" value="Verbose" />
  </switches>
</system.diagnostics>

Now when you make the request you should see more diagnostic information in the Output window in Visual Studio:

Microsoft.Owin.Security.OAuth.OAuthBearerAuthenticationMiddleware Error: 0 : Authentication failed
System.ArgumentException: IDX10708: 'System.IdentityModel.Tokens.JwtSecurityTokenHandler' cannot read this string: 'abcdefg'.
The string needs to be in compact JSON format, which is of the form: '<Base64UrlEncodedHeader>.<Base64UrlEncodedPayload>.<OPTIONAL, Base64UrlEncodedSignature>'.

As we should have predicted, the token we passed in isn’t a valid JWT – it’s not even valid JSON. Let’s fix this by generating an actual access token for this Web API. In a previous post I manually walked through the process of authenticating, retrieving an authorization code and then exchanging it for an access token. I’ll do the same here.

First I’m going to launch an authorization url in the browser and sign in using credentials from the nicksdemodir.onmicrosoft.com tenant:

https://login.microsoftonline.com/nicksdemodir.onmicrosoft.com/oauth2/authorize?client_id=a07aa09e-21b9-4e86-b269-a18903b5fe54&response_type=code&redirect_uri=https://localhost:44331/

The authorization url is made up of various components:

nicksdemodir.onmicrosoft.com – This is the domain name of the tenant where the web application is registered with Azure AD. You can also use the tenant Id (guid format)

a07aa09e-21b9-4e86-b269-a18903b5fe54 – This is the application id of the application registration in Azure AD

code – This indicates that the response should be an authorization code

https://localhost:44331/  – This is the uri that the browser will be redirected back to, passing with it the code in the query string.

Make sure you have the web application running, otherwise the redirect uri won’t resolve and it may be hard to extract the code from the query string (depending on the browser). After signing in, you’ll be redirected back to your web application with a URL similar to the following (the code has been shortened for brevity):

https://localhost:44331/?code=zvrs_zz0…….05B_ggAA&session_state=ef2986b8-75bd-484a-b9b9-68f0e46ab569

The next thing to do is to prepare a POST request in your http tool of choice with the following:

URL: https://login.microsoftonline.com/nicksdemodir.onmicrosoft.com/oauth2/token

Body: grant_type=authorization_code&client_id=a07aa09e-21b9-4e86-b269-a18903b5fe54&client_secret=c06kP0Q9ENGpZGbiZTqB1QQaZUWNe190mCittRMr&redirect_uri=https://localhost:44331/&code=zvrs_zz0…….05B_ggAA&resource=a07aa09e-21b9-4e86-b269-a18903b5fe54

The Body parameters are broken down as:

a07aa09e-21b9-4e86-b269-a18903b5fe54 – This is the application id of the application registration in Azure AD. It’s required as both the client_id and the resource, since we’re using the access token to access the web application itself.

c06kP0Q9ENGpZGbiZTqB1QQaZUWNe190mCittRMr – This is a private key (aka client secret) issued by the Azure AD application to ensure the security of token requests. I’ll come back to this in a second and show how to create one for your application.

https://localhost:44331/ – The redirect uri for the application – required here to verify the calling party as it has to align with what’s in Azure AD

zvrs_zz0…….05B_ggAA – This is the authorization code returned in the previous step
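If you’d rather script this exchange than compose it by hand, here’s a minimal C# sketch using HttpClient. The client id, redirect uri and resource are the values from above; the authorization code and client secret are passed in as parameters (error handling omitted):

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class TokenClient
{
    // Exchanges an authorization code for an access token (same parameters as the POST body above)
    public static async Task<string> ExchangeCodeForTokenAsync(string code, string clientSecret)
    {
        using (var client = new HttpClient())
        {
            var body = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "authorization_code",
                ["client_id"] = "a07aa09e-21b9-4e86-b269-a18903b5fe54",
                ["client_secret"] = clientSecret,
                ["redirect_uri"] = "https://localhost:44331/",
                ["code"] = code,
                ["resource"] = "a07aa09e-21b9-4e86-b269-a18903b5fe54"
            });
            var response = await client.PostAsync(
                "https://login.microsoftonline.com/nicksdemodir.onmicrosoft.com/oauth2/token", body);
            // The JSON response includes an access_token property
            return await response.Content.ReadAsStringAsync();
        }
    }
}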

To generate the client secret in Azure AD simply click on the Keys tab within the details of the application registration. You can then create a new key by entering a description. The description is only seen by you, so give it a name that’s meaningful to you. Note that once you save the new key, you will only be shown the value of the key once. Once you leave the page, the value of the key will never be shown again.


The key created in the Azure AD should be used as the client secret when doing the authorization code to access token exchange.

The response to this POST should return JSON which includes an access_token value. Add the access token to the Authorization header:

Authorization: Bearer G1ZsPGjPF6qJO8Sd5HctnKqNk_8KDc-………Lpy9P8sDWdECziihaPWyseug9hgD119keoZuh4B
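If you want to make the same call from code rather than Fiddler, here’s a minimal sketch to run inside an async method, where accessToken holds the token returned from the previous step:

using System.Net.Http;
using System.Net.Http.Headers;

var client = new HttpClient();
// Attach the bearer token to the Authorization header
client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", accessToken);
var response = await client.GetAsync("https://localhost:44331/api/values");
var json = await response.Content.ReadAsStringAsync(); // e.g. ["value1","value2"]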

This should give you a 200 response with data. And there you have it – you’ve successfully secured your web api so that it requires the user to be authenticated using Azure Active Directory.

Improving the Azure Active Directory Sign-on Experience

I was talking to a customer the other day and had to log into the Azure portal. Normally when I launch the portal I’m already signed in and I’m not prompted, but for whatever reason this time I was prompted to authenticate. Doing this in front of the customer led to three interesting discussions:

– Use of two factor authentication to secure sign in
– Separate global administrator account for primary organisation tenant
– Company branding for Azure AD sign in

Firstly, the use of two factor authentication (TFA) is a must for anyone who is using the Azure portal – if you are an administrator of your organisation, please make sure you enforce this requirement for anyone accessing your tenant/directory/subscription. This applies to staff, contractors, guests etc who might be using your Azure portal or the Office 365 portal. In fact, in this day and age, I would be enforcing two factor authentication for all employees – note that Outlook and Skype for Business are still stuck in the dark ages and don’t support TFA sign in. For these you’ll need to generate an application password (go to https://myapps.microsoft.com, click on your profile image in the top right corner and select “Profile”, click through to “Additional security verification”, click on the “app passwords” tab and then click “Create” to generate an app password).

Ok, next is the use of a separate global administrator account – this is in part tied to the previous point about using TFA. If you’re a global administrator of your tenant and you enable TFA, you won’t be able to generate app passwords. This is essentially forcing you down the path of best practice, which is to have a separate account which is the global administrator for your tenant. If other people in your organisation need administrative permissions, you can do this on a user or role basis within the Azure portal – our preference is to assign permissions to a resource group but there is enough fidelity within the portal to control access at the level you desire.

The other thing we’ve also enforced is that we do not host any Azure resources in our primary tenant (ie in our case builttoroam.com). Given the importance of Office365 based services we felt it important that we isolate off any resources we create in Azure to make sure they’re completely independent of our primary tenant. The only exception to this is if we are building internal LOB applications (ie only apps for Built to Roam use) – for these we include the app registrations within the builttoroam.com tenant so that we can restrict sign in and at the same time deliver a great sign in experience for our employees. For example we’re using Facebook Workplace (https://workplace.fb.com/) – we configured this within the builttoroam.com tenant in Azure AD to allow for a SSO experience.

Now, onto the point of this post – the last thing that came out of signing in to the portal in front of the customer was that they were startled when our company branding appeared as part of the sign in process. To illustrate, when you first land on the portal sign in page you see:


After entering my email address, the sign in page changes to incorporate the Built to Roam branding


This not only improves the perception of professionalism (for internal and external users), it also gives everyone a sense of confidence that they’re signing into a legitimate Built to Roam service.

In order to set this up, you need to navigate to the Active Directory node in the Azure portal and click on the Company branding tab. If you’re using Office 365 you should already have access to this tab. However, if you’re not, you may need to sign up for Azure Active Directory Premium – you can get started using the Premium trial:


Once you’ve opened the Company branding tab (if you have just activated the trial, you may need to wait a few minutes and/or sign out and back in again in order for the Company branding tab to open) you can click on the link to “Configure company branding now”


There are a number of options and images that you can configure:


After saving the changes, if you attempt to sign in, you’ll notice the new images/colours etc appear. In this case, you can see that the welcome text at the bottom of the sign in page has been changed to what I entered in the company branding tab. Unfortunately because I didn’t set the sign in page image, the default is used, so you can’t see the red (#FF0000) background I set – you can see glimpses of it if you resize the page. This can be fixed by simply uploading a transparent image.


The ability to customise the sign in experience is just one way to improve the experience for your staff and customers.

Useful OAuth, OpenID Connect, Azure Active Directory and Google Authentication Links

Over the past couple of weeks I’ve been assisting with the development work of an enterprise system that uses both Azure Active Directory (Azure AD) and Google to authenticate users. It’s a cross-platform solution, which means we need code that works across both authentication platforms and the three mobile platforms. Unfortunately this is easier said than done – the Azure AD team have done a reasonable job with the ADAL library, but it’s not like we can repurpose that library for authenticating against Google. This is a tad annoying since Azure AD and Google both use OAuth and OpenID Connect, so you’d expect there to be a good library that would work across both.

In trying to find a workable solution I came across a number of links that I want to bookmark here for future reference:

OAuth 2

Home – https://oauth.net/2/

The OAuth home page is a good starting point if you want more links and information about OAuth (1 and 2), but I actually found its main use for me was to point at the OAuth 2.0 Framework RFC.

OAuth 2.0 Framework RFC – https://tools.ietf.org/html/rfc6749

You can think of the OAuth 2.0 Framework RFC as being the specification for OAuth 2.0. There are some extensions and other standards that relate to OAuth 2.0 but this is a must read if you want to understand what OAuth 2.0 is all about. You may need to refer back to this when reading other blogs/tutorials as it can help clarify what each of the roles and responsibilities are in the process.

Simple overview of OAuth 2 – https://aaronparecki.com/2012/07/29/2/oauth2-simplified

This overview provides a quick summary of the various flows for OAuth 2.0. However, I disagree with the use of the implicit workflow for mobile applications. Whilst mobile applications are not “trusted,” which would normally imply the use of the implicit workflow, the reality is that the implicit workflow can’t issue refresh tokens. This means that unless you want your users to have to log in each time they use your mobile application, you need to use the Authorization Code workflow (the client secret shouldn’t be required when requesting access tokens for mobile apps – this depends on which authentication provider you’re using).
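To make that last point concrete, here’s roughly what an authorization code token request from a mobile app looks like – note the absence of a client_secret (this is a sketch; the exact endpoint and parameters vary by provider):

POST https://login.microsoftonline.com/common/oauth2/token HTTP/1.1
Content-Type: application/x-www-form-urlencoded

grant_type=authorization_code&client_id=<app id>&redirect_uri=<redirect uri>&code=<auth code>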

 

OpenID Connect

Home – http://openid.net/connect/

The OpenID Connect home page is again a good starting point as it links to the many different parts of the OpenID Connect standard. OpenID Connect builds on top of OAuth 2.0 in order to provide a mechanism for users to be authenticated as well as authorized for resource access. In addition to access tokens, OpenID Connect defines an id_token, which can be issued in the absence of any resource and is used purely to identify the user that has authenticated.
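An id_token is itself just a JWT, so you can crack it open to inspect the identity claims. Here’s a minimal C# sketch, assuming the System.IdentityModel.Tokens.Jwt NuGet package (v5+) – note that this only reads the token, it doesn’t validate the signature:

using System;
using System.IdentityModel.Tokens.Jwt;

// Read the claims out of an id_token (no signature validation here)
var token = new JwtSecurityTokenHandler().ReadJwtToken(idToken);
foreach (var claim in token.Claims)
{
    // Typical claims include sub (the user's unique id), name and email
    Console.WriteLine($"{claim.Type}: {claim.Value}");
}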

OpenID Connect Core 1.0 – http://openid.net/specs/openid-connect-core-1_0.html

This is the core specification of OpenID Connect. Similar to the specification for OAuth, this is worth both a read and to be used as a reference when working with OpenID Connect implementations.

OpenID Connect Session Management 1.0 – http://openid.net/specs/openid-connect-session-1_0.html

Whilst still in draft, this standard covers how implementers are supposed to handle log out scenarios, which is useful as your application can’t simply delete its access tokens when a user opts to log out. Ideally when a user logs out, you’d want to make sure cached tokens are cleared, along with invalidating any access or refresh tokens.
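As an example of the client-side half of this, with Azure AD and the ADAL library a sign out would typically involve clearing the local token cache and then sending the user to the tenant’s logout endpoint (a sketch – the logout URL format shown here is Azure AD specific):

// Clear ADAL's locally cached access/refresh tokens
authenticationContext.TokenCache.Clear();

// Then navigate the user's browser/webview to the logout endpoint to end the server-side session
var logoutUrl = "https://login.microsoftonline.com/nicksdemodir.onmicrosoft.com/oauth2/logout" +
                "?post_logout_redirect_uri=https://localhost:44331/";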

 

Google

OAuth 2.0 Overview – https://developers.google.com/identity/protocols/OAuth2

OpenID Connect – https://developers.google.com/identity/protocols/OpenIDConnect

Google’s documentation isn’t too bad but does require you to read all of the pages, as the OAuth and OpenID Connect implementation details seem to be scattered across them. The assumption is that for any given type of application you can simply read the one page – unfortunately, if you want to get an understanding of the Google implementation, you really need to read everything. Authenticating/authorizing with Google is significantly simpler than with Azure AD, as there is no notion of linking your application registration, with specific permissions, to other applications registered in the directory. This is also a significant limitation of using Google sign in, as you can really only use it to authenticate and then use the token to access the various Google APIs.

 

Azure Active Directory

Azure AD Developer’s Guide – https://docs.microsoft.com/en-au/azure/active-directory/develop/active-directory-developers-guide

Authentication Scenarios for Azure AD – https://docs.microsoft.com/en-au/azure/active-directory/develop/active-directory-authentication-scenarios

Azure AD is a much longer read, and it’s very easy to get lost in the world of application configuration and settings. My recommendation is to start with something simple, and then grow from that. For example, start by authenticating a user to sign into your mobile app, then extend it so that you can use the access token to connect to a Web API, then go on to retrieve information from other Microsoft services within the Web API, and then perhaps make it all multi-tenanted (that’s one for another post!).