Nick's .NET Travels

Continually looking for the yellow brick road so I can catch me a wizard....

Source Code for Real Estate Inspector Sample on GitHub

I’ve finally got around to publishing the current source code to GitHub. It’s currently a bit of a hotch-potch as I’ve been focussing on demonstrating/fleshing out a lot of the concepts for my blog posts. Over the coming weeks I’ll be extending the actual functionality and will periodically update the code on GitHub. For now, it’s a good place to pick through the various code I’ve been talking about over the last couple of months.

Using a Refresh Token to Renew an Expired Access Token for Azure Active Directory

Currently my application attempts to acquire the access token silently which equates to looking to see if there is a current (ie not expired) token in the token cache. However, tokens don’t live for very long, so it’s quite likely that a token won’t be found. This unfortunately leads to a poor user experience as the user will quite often be prompted to sign in. There is an alternative, which is to use the refresh token, returned as part of initially acquiring the access token, to silently request a new access token. This of course is on the assumption that the refresh token hasn’t expired.

Here is a quick summary, as at the time of writing, of the different tokens and their expiry rules (a good explanation here):

  • Azure AD access tokens expire in 1 hour (see the expires_on attribute that is returned when acquiring an access token).
  • Refresh tokens expire in 14 days (see the refresh_token_expires_in attribute that is returned when acquiring an access token).
  • Access tokens can be renewed using the refresh token for a maximum period of 90 days from the date the access token was originally acquired by prompting the user.

The authentication logic can be amended to retrieve the list of cached tokens, attempt to acquire a token silently, and then attempt to acquire a token via the refresh token. Failing both of those, the user is prompted to sign in.

var authContext = new AuthenticationContext(Configuration.Current.ADAuthority);

// Look in the token cache for a token matching this client id and resource
var tokens = authContext.Tokens();
var existing = (from t in tokens
                where t.ClientId == Configuration.Current.ADNativeClientApplicationClientId &&
                      t.Resource == Configuration.Current.MobileServiceAppIdUri
                select t).FirstOrDefault();
if (existing != null)
{
    // First, attempt silent acquisition from the token cache
    try
    {
        var res = await authContext.AcquireTokenSilentAsync(
            Configuration.Current.MobileServiceAppIdUri,
            Configuration.Current.ADNativeClientApplicationClientId);
        if (res != null && !string.IsNullOrWhiteSpace(res.AccessToken))
        {
            return res.AccessToken;
        }
    }
    catch (Exception saex)
    {
        Debug.WriteLine(saex);
    }

    // Silent acquisition failed, so attempt to use the refresh token
    try
    {
        var res = await
            authContext.AcquireTokenByRefreshTokenAsync(existing.RefreshToken,
                Configuration.Current.ADNativeClientApplicationClientId);
        if (res != null && !string.IsNullOrWhiteSpace(res.AccessToken))
        {
            return res.AccessToken;
        }
    }
    catch (Exception saex)
    {
        Debug.WriteLine(saex);
    }

}
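If neither silent path yields a token (or nothing was found in the cache), the final fallback is the interactive prompt. Here's a sketch using the ADAL v3 prerelease signature (the PlatformParameters argument varies by platform, so treat this as indicative rather than definitive):

```csharp
// Fall back to interactive sign-in when no cached or refreshed token is available
var interactive = await authContext.AcquireTokenAsync(
    Configuration.Current.MobileServiceAppIdUri,
    Configuration.Current.ADNativeClientApplicationClientId,
    new Uri(Configuration.Current.ADRedirectUri),
    new PlatformParameters(PromptBehavior.Auto));
return interactive != null ? interactive.AccessToken : null;
```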

Azure Active Directory with Mobile Services without Prompting Every Time the Application Starts

Currently, every time the application is run the user is prompted to sign into Azure Active Directory, and the AD-issued token is then used to log in to the Azure Mobile Service. Not only is this a pain for the user (if they’ve only just been in the application, having to sign in again feels unnecessary), it also adds latency on startup and prevents the application from running offline. In the next couple of posts I’ll look at a couple of techniques to improve this sign-on experience.

Firstly, it’s worth noting that there was an update posted for the Azure Active Directory Authentication library (ADAL) on NuGet – it’s still prerelease but worth updating to if you’re using v3 of the library. More info on NuGet, here.

One of the nice things about ADAL is that it provides a cache for tokens. In addition to being able to query what tokens are in the cache (for example in order to then log in to the Mobile Service), it also wraps the check to determine whether a token is still valid. To do this, I can call AcquireTokenSilentAsync to authenticate silently, ie without prompting the user. If a valid access token is found in the token cache it will be returned. In the case that no valid token is found, an exception is raised and I then need to invoke AcquireTokenAsync as I did previously.

var authContext = new AuthenticationContext(Configuration.Current.ADAuthority);
try
{
    var res = await authContext.AcquireTokenSilentAsync(
        Configuration.Current.MobileServiceAppIdUri,
        Configuration.Current.ADNativeClientApplicationClientId);
    if (res != null && !string.IsNullOrWhiteSpace(res.AccessToken))
    {
        return res.AccessToken;
    }
}
catch (Exception saex)
{
    Debug.WriteLine(saex);
}

As Windows Phone 8.0 isn’t supported yet by v3 of ADAL, I also need to update my custom implementation of the AuthenticationContext. Firstly, I add a static list of previously acquired tokens:

private static readonly List<AuthenticationResult> Tokens = new List<AuthenticationResult>();  

Next, I need to update my AuthenticationResult class to decode more than just the access and refresh tokens:

public class AuthenticationResult
{
    private static DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc).ToLocalTime();

    public string ClientId { get; set; }

    [JsonProperty("access_token")]
    public string AccessToken { get; set; }

    [JsonProperty("refresh_token")]
    public string RefreshToken { get; set; }

    [JsonProperty("resource")]
    public string Resource { get; set; }

    [JsonProperty("expires_on")]
    public long ExpiresOnSeconds { get; set; }

    public DateTime ExpiresOn
    {
        get { return epoch.AddSeconds(ExpiresOnSeconds); }
    }

    private long refreshTokenExpiresInSeconds;
    [JsonProperty("refresh_token_expires_in")]
    public long RefreshTokenExpiresInSeconds
    {
        get { return refreshTokenExpiresInSeconds; }
        set
        {
            refreshTokenExpiresInSeconds = value;
            RefreshTokensExpiresOn = DateTime.Now.AddSeconds(refreshTokenExpiresInSeconds);
        }
    }

    public DateTime RefreshTokensExpiresOn { get; private set; }

}

At the end of the AcquireTokenAsync method I need to set the ClientId property on the AuthenticationResult (so I know which AAD client id it was returned for as this isn’t returned in the response) and add the result to the Tokens list:

var result = JsonConvert.DeserializeObject<AuthenticationResult>(await data.Content.ReadAsStringAsync());
result.ClientId = ClientId;
Tokens.Add(result);
return result;

Finally, I need to implement the AcquireTokenSilentAsync method. Although it doesn’t require async/Task, I’ve kept the signature consistent with ADAL to avoid conditional code when calling the method:

public async Task<AuthenticationResult> AcquireTokenSilentAsync(
    string resource,
    string clientId)
{
    // Return a cached token that matches this client/resource and hasn't expired
    var result = (from t in Tokens
                  where t.ClientId == clientId &&
                        t.Resource == resource &&
                        t.ExpiresOn > DateTime.Now
                  select t).FirstOrDefault();

    return result;
}

Note that this implementation doesn’t persist the access token beyond the current session. However, it will avoid the need to reauthenticate if the user does happen to do something that would otherwise require authentication.
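If you did want tokens to survive a restart, the static Tokens list could be serialized to local storage. A rough sketch follows (the file name and use of JSON.NET are my own choices, and a real implementation should encrypt the tokens, eg with DataProtectionProvider, before writing them to disk):

```csharp
// Sketch: persist/restore the in-memory token cache across sessions
// (tokens are stored unencrypted here - don't do this in production)
private static async Task SaveTokensAsync()
{
    var file = await ApplicationData.Current.LocalFolder.CreateFileAsync(
        "tokens.json", CreationCollisionOption.ReplaceExisting);
    await FileIO.WriteTextAsync(file, JsonConvert.SerializeObject(Tokens));
}

private static async Task LoadTokensAsync()
{
    try
    {
        var file = await ApplicationData.Current.LocalFolder.GetFileAsync("tokens.json");
        var json = await FileIO.ReadTextAsync(file);
        Tokens.AddRange(JsonConvert.DeserializeObject<List<AuthenticationResult>>(json));
    }
    catch (FileNotFoundException)
    {
        // Nothing persisted yet - first run
    }
}
```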

Communication and Synchronization using Background Tasks with Windows and Windows Phone 8.1

In an earlier post I covered creating a background task for Windows platform applications that would allow for synchronization in the background, triggered by change in Internet availability. However, one thing I didn’t cover is what happens when the application is in use, in this case, not only might the user be interacting with the data (loading and writing new records), they might have even forced a synchronization within the application. For this reason it may be necessary to prevent background synchronization whilst the foreground application is being used.

If you haven’t worked with Windows/Windows Phone 8.1 before, the OnActivated method in the Application class might be confusing – it’s not the same as the Activated/Deactivated pair from Windows Phone 7.x/8.0, which occurred reliably when the application went into the background. For Windows/Windows Phone 8.1 you need to look at the Window VisibilityChanged event to detect when the Window goes into the background (note that there is an issue with this on Windows 10, where it isn’t triggered when you switch to another application, as all applications are windowed by default).

protected override void OnWindowCreated(WindowCreatedEventArgs args)
{
    args.Window.VisibilityChanged += WindowVisibilityChanged;
}

In the event handler I’m going to take ownership of a named Mutex when the Window is visible, and release it when the Window is no longer visible:

private static Mutex backgroundMutex = new Mutex(false, "BackgroundSync");
private async void WindowVisibilityChanged(object sender, Windows.UI.Core.VisibilityChangedEventArgs e)
{
    if (e.Visible)
    {
        backgroundMutex.WaitOne();

        await ReregisterTasks();
    }
    else
    {
        backgroundMutex.ReleaseMutex();
    }
}

You’ll notice that after taking ownership of the Mutex I then reregister my background task:

private async Task ReregisterTasks()
{
    try
    {
        BackgroundTaskManager.UnregisterBackgroundTasks("Background Sync Task", true);
        BackgroundTaskRegistration taskReg =
            await BackgroundTaskManager.RegisterBackgroundTask(typeof(SyncTask).FullName,
                "Background Sync Task",
                new SystemTrigger(SystemTriggerType.InternetAvailable, false),
                null);
    }
    catch (Exception ex)
    {
        // Swallow this to ensure app doesn't crash in the case of back ground tasks not registering
        Debug.WriteLine(ex.Message);
    }
}

The reason for this is that we want to cancel any running background task (the “true” parameter passed into the UnregisterBackgroundTasks call).

In the background task I have a Mutex with the same name (Named Mutexes are shared across processes so a great way to communicate between foreground/background tasks). At the beginning of the Run method I attempt to acquire the Mutex: if this succeeds I know my foreground application isn’t visible; if this fails (which it will do immediately since I specified a wait time of 0) it will simply return from the task as we don’t want to sync whilst the foreground application is visible. If the background task is going to run, I immediately release the Mutex which will ensure that if the foreground application is made visible, or launched, it won’t be blocked waiting to acquire the Mutex.

private static Mutex foregroundMutex = new Mutex(false, "BackgroundSync");
public async void Run(IBackgroundTaskInstance taskInstance)
{
    try
    {
        if (!foregroundMutex.WaitOne(0)) return;
        foregroundMutex.ReleaseMutex();
        // do the rest of background task

It’s important within your background task to handle cancellation (via the Canceled event on the task instance) so that if the background task is executing when it is unregistered, the task can be cancelled gracefully.
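A minimal sketch of that cancellation wiring follows (SynchronizeAsync is a hypothetical helper standing in for the actual sync work; the deferral ensures the async work isn’t torn down early):

```csharp
private volatile bool cancelRequested;

public async void Run(IBackgroundTaskInstance taskInstance)
{
    var deferral = taskInstance.GetDeferral();
    taskInstance.Canceled += (s, reason) => cancelRequested = true;
    try
    {
        // Bail immediately if the foreground application is visible
        if (!foregroundMutex.WaitOne(0)) return;
        foregroundMutex.ReleaseMutex();

        // The sync work periodically checks the flag so it can stop at a safe point
        await SynchronizeAsync(() => cancelRequested);
    }
    finally
    {
        deferral.Complete();
    }
}
```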

Custom Domains for Azure Mobile Services

When packaging a cloud-based solution, one of the tasks is to change the configuration of the services so that they have an application-specific domain. In the case of Azure Websites this feature has been available for quite a while in the form of custom domains. However, it was only recently that this capability was added to Azure Mobile Services. This enables me to change the Mobile Service URL from https://realestateinspector.azure-mobile.net to https://realestate.builttoroam.com. Note that this capability is only available to Mobile Services running in Standard mode, which can be quite a costly commitment if custom domains are the only reason to upgrade.

Here’s a quick run through of setting up a custom domain. Note that this doesn’t include setting up SSL for your custom domain, which is highly recommended. There is more information here that includes using wildcard SSL certificates, which might be useful if you are packaging multiple services (eg Mobile Service and a Website) off the same base domain.

The first thing to do is to setup a CName record (alternatively you can setup an A record using these instructions) – this needs to be done with the name service that hosts the DNS records for your domain.
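As an illustration, the record created for this post maps the subdomain to the Mobile Service host – shown here in generic zone-file notation (your DNS provider’s interface will differ):

```
; CNAME record at the DNS provider (illustrative)
realestate.builttoroam.com.    CNAME    realestateinspector.azure-mobile.net.
```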


If you simply try to browse to the new URL you’ll see quite a useful 404 message. The first option is exactly the scenario I now face – I have to configure the Mobile Service to know about the custom domain.


Currently there is no UI in the Azure portal for managing custom domains for Mobile Services, unlike Azure Websites where it can all be configured in the portal. Instead, I need to use the Azure CLI. Before doing this, make sure you are using v0.8.15 or higher (v0.8.15 is current at the time of writing). Note that I ran into some issues upgrading the Azure CLI – docs online suggest using npm (eg npm update azure-cli, or npm update azure-cli -g depending on whether you installed the azure-cli globally). However, I found that this wasn’t working – the output suggested it had updated to 0.8.15, but when I queried azure -v I saw an earlier version. It turns out I’d installed the azure-cli via the Web Platform Installer – in this case you either need to uninstall the azure-cli via the platform installer, or simply install the new version via the platform installer (which is what I did).
Adding a custom domain is then relatively straightforward: azure mobile domain add <mobileservicename> <customdomain>, eg azure mobile domain add realestateinspector realestate.builttoroam.com


Now when you browse to the new url you see the typical Mobile Service status homepage.


When I run my client applications I need to update the Mobile Service Client URL to point to the new url. I can then see in Fiddler that the traffic is indeed going to the new custom domain.


Database Migrations with Package Manager Console and Azure Mobile Services

I was caught out recently after I published an incorrect database migration to my cloud-based Azure Mobile Service (I created a second controller based on the RealEstateProperty entity instead of the PropertyType entity). The upshot is that I only noticed this when all the entities I was synchronizing down from the cloud came back with null for most of their properties. Initially I thought my issue was with the migration I had performed on the database, so I thought I’d roll back to a previous version. My most recent migration was “201502260615263_Added proeprty type entity” and I wanted to roll it back to the previous migration, “201501061158375_AddedInspections”. To do this you can simply call update-database in the Package Manager Console:

update-database -TargetMigration "201501061158375_AddedInspections"

However, I wanted to invoke this only on the database for the Mobile Service running in the cloud. To do this I need to add the -ConnectionString and -ConnectionProviderName parameters. The latter is easy as it needs to be the static value "System.Data.SqlClient", but the former requires two steps:

- In the Azure Management Portal go to the SQL Databases tab and then select the database that correlates to the Mobile Service. With the database selected, click “Manage” from the toolbar – this will prompt to add a firewall rule allowing access from your computer (this only happens the first time or again if your ip address changes). You need to add this firewall rule as Visual Studio will be attaching directly to the database to run the code-first migration on the database.


- From the Dashboard pane of the SQL Server database, select Connection Strings from the right link menu, and copy the contents of the ADO.NET connection string.


Now I can add the connection string to the update-database method:

update-database -TargetMigration "201501061158375_AddedInspections" -ConnectionString "Server=tcp:p7zzqmjcmf.database.windows.net,1433;Database=realestateinspector;User ID={my username};Password={your_password_here};Trusted_Connection=False;Encrypt=True;Connection Timeout=30;" -ConnectionProviderName "System.Data.SqlClient"

I checked that this had removed the PropertyType table (which was part of the migration I just reversed) and then removed the old migration file, “201502260615263_Added proeprty type entity.cs”, and then regenerated the new migration by calling add-migration again:

add-migration 'Added proeprty type entity'

Given that the Mobile Service itself hadn’t changed at that point, I figured I’d simply call update-database without the TargetMigration parameter but with the ConnectionString that points to my actual Mobile Service. This seemed to go ok, but when I ran my Mobile Service and attempted to synchronize my PropertyType entities an exception was thrown – and here I discovered the root of my issue, which was that I had two controllers both referencing the RealEstateProperty entity. I fixed that and republished my Mobile Service. Now synchronization worked, but mainly because there were no entities in the PropertyType table in the database, so I then attempted to add a PropertyType using direct access (rather than synchronizing entities) in the MobileServiceClient (using GetTable instead of GetSyncTable). This caused a weird exception as it seemed to require that the CreatedAt property be set. I’ve never had to do this on previous inserts, so I sensed something was wrong. Using the Visual Studio 2015 CTP I connected directly to the SQL Server database and, sure enough, my PropertyType table had no triggers for insert/update. Usually this is where the CreatedAt column is updated.

So, feeling a little puzzled I decided to undo my migration on my Mobile Service database once more. But this time, instead of attempting to change any of the migration scripts, all I did was republish my Mobile Service. Now when I attempted to add a PropertyType it worked, no problems. Checking with Visual Studio 2015, the trigger on the PropertyType table had been successfully created. At this point I’m not sure what exactly happens when the Mobile Service runs but it seems to do more than just applying the code-first migrations. It definitely seems to me that updating the cloud database using the package manager console seemed to skip the validation step that Mobile Services does in order to add the appropriate triggers, and thus should be avoided.

Multiple Bootstrapper in WebApiConfig for Mobile Service

In my “wisdom” I decided to rename the primary assembly for my Mobile Service (ie just changing the assembly name in the Properties pane for the Mobile Service).


This all worked nicely when running locally but when I published to Azure I started seeing the following error in the Log, and of course my service wouldn’t run…

Error: More than one static class with name 'WebApiConfig' was found as bootstrapper in assemblies: RealEstateInspector.Services, realestateinspectorService. Please provide only one class or use the 'IBootstrapper' attribute to define a unique bootstrapper.

Turns out that when I was publishing I didn’t have the “Remove additional files at destination” box checked in the Publish Web dialog. This meant that my old Mobile Service assembly (ie with the old name) was still floating around. As reflection is used over the assemblies in the bin folder to locate the bootstrapper, it was picking up the same class in both assemblies – hence the issue.


Checking the “Remove additional files at destination” box ensures only those files that are currently in your Mobile Service project are deployed.

Azure Active Directory Graph API and Azure Mobile Service

Last month in an earlier post I talked about using the Azure Active Directory Graph API Client library in my Azure Mobile Service. Whilst everything I wrote about does indeed work when published to the cloud, it raises a number of errors that are visible in the log, and the status of the service ends up as Critical – which is definitely something I don’t want. The error looks something like the following:

Error: Found conflicts between different versions of the same dependent assembly 'System.Spatial': 5.6.2.0, 5.6.3.0. Please change your project to use version '5.6.2.0' which is the one currently supported by the hosting environment.

Essentially the issue is that the Graph API references a newer version of some of the data libraries (System.Spatial, Microsoft.Data.OData, Microsoft.Data.Edm and Microsoft.Data.Services.Client, to be exact). What’s unfortunate is that even using a runtime redirect in the web.config file to point to the newer versions of these libraries, which are deployed with the service, the errors still appear in the log. As there essentially don’t seem to be any compatibility issues between the Graph API and the slightly older versions (ie 5.6.2.0), I even tried downgrading the other libraries (you can use -Force in the package manager console to remove NuGet packages even if others depend on them, so I removed the new versions and added the old versions back in), but of course Visual Studio then fails its validation checks during compilation.
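For reference, the runtime redirect mentioned above is the standard assemblyBinding section in web.config – something like the following (the publicKeyToken shown is the usual one for these data libraries, but verify it against your packages; as noted, this didn’t stop the errors appearing in the log):

```xml
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <assemblyIdentity name="System.Spatial"
                        publicKeyToken="31bf3856ad364e35" culture="neutral" />
      <!-- Redirect the older version expected by the host to the deployed 5.6.3.0 -->
      <bindingRedirect oldVersion="0.0.0.0-5.6.3.0" newVersion="5.6.3.0" />
    </dependentAssembly>
    <!-- Repeat for Microsoft.Data.OData, Microsoft.Data.Edm and Microsoft.Data.Services.Client -->
  </assemblyBinding>
</runtime>
```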

The upshot is that you have to either:

- Wait for the Mobile Services team to upgrade their backend to support the new versions of these libraries… personally I don’t understand why this causes an error in the logs and forces the service to Critical, particularly since my service actually appears to be operating fine!

- Downgrade the Graph API library back to the most recent v1 library – this references the older versions of those libraries, so has no issues. Unfortunately it doesn’t contain the well-factored ActiveDirectoryClient class, making it harder to query AAD.

Migrating Data Between Blob Storage Accounts in Azure

Over the last couple of posts I’ve been talking about working with different configurations, and in my previous post I noted that one of the things we had to do was migrate some data that had been entered into the Test environment into the Production environment (again, I stress that I’m not recommending this, but occasionally you have to bend the process a little). One of the challenges we encountered was that we not only had to migrate the database, which was easy using the database copy capability in the Azure portal, we also needed to migrate the related blob storage data from one account to another. Here’s some quick code that makes use of the Azure Storage client library (WindowsAzure.Storage package via NuGet; more information here).

Firstly in the app.config we have two connection strings:

<connectionStrings>
    <add name="BlobMigrator.Properties.Settings.SourceStorage"
      connectionString="DefaultEndpointsProtocol=https;AccountName=sourceaccount;AccountKey=YYYYYYYYYYY" />
    <add name="BlobMigrator.Properties.Settings.TargetStorage"
      connectionString="DefaultEndpointsProtocol=https;AccountName=targetaccount;AccountKey=XXXXXXXXXXXXXXX" />
</connectionStrings>

Next, some straightforward code to iterate through the containers in one storage account and copy their contents across to the target account:

var source = CloudStorageAccount.Parse(Settings.Default.SourceStorage);
var target = CloudStorageAccount.Parse(Settings.Default.TargetStorage);

var sourceClient = source.CreateCloudBlobClient();
var targetClient = target.CreateCloudBlobClient();

var containers = sourceClient.ListContainers("searchprefix").ToArray();
Debug.WriteLine("Source containers: " + containers.Length);
var idx = 0;
foreach (var cnt in containers)
{
    // Make sure the container exists in the target account
    var tcnt = targetClient.GetContainerReference(cnt.Name);
    await tcnt.CreateIfNotExistsAsync();

    var sblobs = cnt.ListBlobs();
    foreach (var sblob in sblobs)
    {
        // Start a server-side copy of each blob into the target container
        var b = await sourceClient.GetBlobReferenceFromServerAsync(sblob.Uri);
        var tb = tcnt.GetBlockBlobReference(b.Name);
        var ok = await tb.StartCopyFromBlobAsync(b.Uri);
        Debug.WriteLine(ok);
    }
    idx++;
    Debug.WriteLine("Migrated {0} of {1} - {2}", idx, containers.Length, cnt.Name);
}

In this case it’s limiting the containers that are copied to those that start with the prefix “searchprefix” but this is optional if you want to copy all containers.
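One caveat worth noting: StartCopyFromBlobAsync only starts a server-side copy and returns a copy id, so for large blobs the copy may still be in progress when the loop moves on. Here's a sketch of polling for completion (the helper name and the 500ms interval are my own choices):

```csharp
// Poll a target blob until the server-side copy completes (or fails)
private static async Task WaitForCopyAsync(CloudBlockBlob targetBlob)
{
    await targetBlob.FetchAttributesAsync();
    while (targetBlob.CopyState.Status == CopyStatus.Pending)
    {
        await Task.Delay(500);
        await targetBlob.FetchAttributesAsync();
    }
    if (targetBlob.CopyState.Status != CopyStatus.Success)
    {
        throw new InvalidOperationException(
            "Copy failed: " + targetBlob.CopyState.StatusDescription);
    }
}
```

Calling await WaitForCopyAsync(tb) after each StartCopyFromBlobAsync makes the migration strictly sequential, but guarantees each blob has landed before the progress line is written.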

Client Configurations for Different Mobile Service Environments

In my previous post I talked about setting up different instances of the backend cloud services. The next thing is to control which environment a given build of the client applications points to. I used to do this with build configurations (eg defining compilation symbols like DEBUG and TEST) to toggle which constants are compiled into the application. This is a little painful if you want to debug against production (ie run the application pointing at production data, but in debug mode so you can step through code). It also meant that my configuration information quickly became distributed all over my Constants file. Here is a simple Configuration class that I’m now using as an alternative. Note that I still use compilation constants as the default way of specifying which configuration is used; however, it can easily be overridden to allow for debugging against test or production.

public class Configuration
{
    static Configuration()
    {
#pragma warning disable 162 // This is to allow for easy override of configuration values to debug issues
        if (false)
        {
            Debug.WriteLine("-----------------WARNING - Default Configuration Values Overridden ------------------");
            Current = Configurations[ConfigurationType.Production];
        }
#pragma warning restore 162

#if DEBUG
#if DEBUGLOCAL
        Current = Configurations[ConfigurationType.LocalDevelopment];
#else
        Current = Configurations[ConfigurationType.Development];
#endif
#elif TEST
        Current = Configurations[ConfigurationType.Test];
#else
        Current = Configurations[ConfigurationType.Production];
#endif
    }

    public static Configuration Current { get; set; }

    public string ADTenant { get; set; }

    public string ADAuthority
    {
        get { return ClientConstants.ADAuthorityRoot + ADTenant; }
    }

    public string ADNativeClientApplicationClientId { get; set; }

    public string ADRedirectUri { get; set; }

    public string MobileServiceRootUri { get; set; }
    public string MobileServiceAppIdUri { get; set; }

    public string MobileServiceApiKey { get; set; }

    public enum ConfigurationType
    {
        LocalDevelopment,
        Development,
        Test,
        Production
    }

    public static IDictionary<ConfigurationType, Configuration> Configurations
    {
        get { return configurations; }
    }

    private static readonly IDictionary<ConfigurationType, Configuration> configurations
        = new Dictionary<ConfigurationType, Configuration>
        {
            {
                ConfigurationType.LocalDevelopment, new Configuration
                {
                    ADTenant = "realestateinspector.onmicrosoft.com",
                    ADNativeClientApplicationClientId = "a5a10ee9-zzzzzzz-4bde-997f-3f1c323fefa5",
                    ADRedirectUri = "http://builttoroam.com",
                    MobileServiceRootUri = "http://localhost:51539/",
                    MobileServiceAppIdUri = "https://realestateinspectordev.azure-mobile.net/login/aad",
                    MobileServiceApiKey = "wpxaI---------------------------EBcg12"
                }
            },
            {
                ConfigurationType.Development, new Configuration
                {
                    ADTenant = "realestateinspector.onmicrosoft.com",
                    ADNativeClientApplicationClientId = "a5a10ee9-zzzzzzz-4bde-997f-3f1c323fefa5",
                    ADRedirectUri = "http://builttoroam.com",
                    MobileServiceRootUri = "https://realestateinspectordev.azure-mobile.net/",
                    MobileServiceAppIdUri = "https://realestateinspectordev.azure-mobile.net/login/aad",
                    MobileServiceApiKey = "wpxaI---------------------------EBcg12"
                }
            },
            {
                ConfigurationType.Test, new Configuration
                {
                    ADTenant = "realestateinspector.onmicrosoft.com",
                    ADNativeClientApplicationClientId = "a5a10ee9-tttt-4bde-997f-3f1c323fefa5",
                    ADRedirectUri = "http://builttoroam.com",
                    MobileServiceRootUri = "https://realestateinspectortest.azure-mobile.net/",
                    MobileServiceAppIdUri = "https://realestateinspectortest.azure-mobile.net/login/aad",
                    MobileServiceApiKey = "wpxaI---------------------------EBcg12"
                }
            },
            {
                ConfigurationType.Production, new Configuration
                {
                    ADTenant = "realestateinspector.onmicrosoft.com",
                    ADNativeClientApplicationClientId = "a5a10ee9-wwww-4bde-997f-3f1c323fefa5",
                    ADRedirectUri = "http://builttoroam.com",
                    MobileServiceRootUri = "https://realestateinspector.azure-mobile.net/",
                    MobileServiceAppIdUri = "https://realestateinspector.azure-mobile.net/login/aad",
                    MobileServiceApiKey = "wpxaI---------------------------EBcg12"
                }
            }
        };
}

Adding a new configuration is easy:
- define a new enumeration value (eg LocalDevelopment)
- create a new entry in the Configurations dictionary
- (optional) test by changing “if (false)” to “if (true)” and changing the specified configuration
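For example, adding a hypothetical Staging environment (all of the values below are placeholders) would look like:

```csharp
// 1. Extend the enumeration:
//    public enum ConfigurationType { LocalDevelopment, Development, Test, Production, Staging }
// 2. Add a matching entry to the configurations dictionary:
{
    ConfigurationType.Staging, new Configuration
    {
        ADTenant = "realestateinspector.onmicrosoft.com",
        ADNativeClientApplicationClientId = "<staging client id>",
        ADRedirectUri = "http://builttoroam.com",
        MobileServiceRootUri = "https://realestateinspectorstaging.azure-mobile.net/",
        MobileServiceAppIdUri = "https://realestateinspectorstaging.azure-mobile.net/login/aad",
        MobileServiceApiKey = "<staging api key>"
    }
}
```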

Different Cloud Environments for Development, Testing and Production

One of the aspects of developing applications that have a cloud backend that gets overlooked initially is how to separate development from test and production versions of the application. For web applications ASP.NET solved this by supporting transformations in the web.config file based on build configuration (eg web.Debug.config and web.Release.config). However, this issue is harder with client applications that don’t have config files and don’t understand configuration transformations. The other issue with transformations is that they’re only applied during the publishing process, rather than simply when you change the build configuration in Visual Studio.

I’ll come back to talk about how I’ve chosen to handle different application configurations in a later post. In this post I want to discuss how we’ve handled having multiple environments for our Mobile Service backend; this includes how we decided to split the environments between our development team’s Azure subscription and the client’s.

Our strategy was to have three environments: Development, Testing and Production. Development was housed within the Built to Roam development Azure subscription, which the development team have access to. For the most part anyone within the development team could deploy to this environment at any stage – of course there was some self-management involved to minimize breaking changes. As an aside, as I’ve pointed out in a previous post, it is possible to set up Mobile Services to run locally, even if you enable Azure Active Directory authentication. The Development environment was also based on an Azure Active Directory (AAD) tenant explicitly created for the development of that project – that way accounts could be added/removed without affecting any other AAD tenant.

Test and Production were both created in the customer’s Azure subscription, to minimize differences between the two environments. Both also connected to the customer’s AAD, which meant that testing could be carried out with real user accounts since their AAD was synchronized with their internal AD. In a scenario where writes back to AAD are supported you may want to consider having Test point to a separate AAD tenant, but for our purposes AAD was read-only, so there was no issue in using the same AAD tenant for both Test and Production.

For each of these environments we created a separate Mobile Service, named according to the environment – with Production being the exception, as we decided to drop the “production” suffix. Taking the RealEstateInspector example, our services would be called:

Development – RealEstateInspectorDev
Testing – RealEstateInspectorTest
Production – RealEstateInspector

Note that we shortened both Development and Testing to just Dev and Test for simplicity.

We also created corresponding storage accounts, with names that matched the names of the mobile services.

We also created corresponding applications in the appropriate Azure Active Directory, again with names that matched the corresponding environment. We didn’t use the same applications for Testing and Production to ensure we could configure them separately if required.

One issue we faced: during the first iteration of development, while the system was undergoing final testing in the Testing environment, some real data was entered into the system. This meant that rather than simply deploying to Production we actually needed to migrate data from Testing to Production (definitely not something I would recommend as best practice). Doing this was relatively simple using Azure’s ability to copy a SQL database and then, within the Mobile Service, changing the database that it points to. We also had to migrate content from one storage account to another, for which we couldn’t find a simple out-of-the-box tool. However, this turned out to be much simpler than we expected and I’ll come back to it in a future post.
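
For what it’s worth, the database copy can be kicked off with a single T-SQL statement against the Azure SQL server – assuming the database names follow the naming convention above:

```sql
-- Run against the master database on the destination server;
-- creates Production as a transactionally consistent copy of Testing
CREATE DATABASE RealEstateInspector AS COPY OF RealEstateInspectorTest;
```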

Adding Logging to Client Applications using MetroLog not NLog

I wanted to add some logging to my set of applications and was somewhat disappointed to discover the complete lack of PCL support in NLog. After a quick search to find out what others are using I came across MetroLog which seems to be very similar in a lot of regards to NLog (in fact they claim the API surface is very similar indeed). I went to install the NuGet package…

and noticed that my Core library (a PCL) wasn’t in the list. Clearly my PCL profile doesn’t match one of the supported profiles, which is a bit painful. However, MetroLog does support all my client application types, so I was happy to proceed with it.

I did have a couple of issues installing the NuGet package – namely, Visual Studio decided to crash midway through installing the packages. This meant I had to manually install Microsoft.Bcl.Compression, which is a dependency for the Windows Phone 8.0 project, and then uninstall and reinstall the support for the Windows and Desktop projects.

After all this I was building successfully again, so it was time to think about how to structure the logging support. Clearly, logging should be as simple as possible and yet accessible virtually anywhere within the application. I want to define a service that is available within my Core library, in a similar way to my IDataService and ISyncService implementations. I also wanted the log output to be written to a sqlite database file for ease of access (there are plenty of third-party tools capable of viewing sqlite db files), but rather than use the SqliteTarget that comes with MetroLog I felt I had to write my own (as you do). Luckily this whole process is relatively simple.

I start by creating an ILogWriterService interface which will provide the low level API for writing a LogEntry to a local Mobile Service sqlite table (I’m going to use the same process that is used to cache my synchronized data, except for the time being at least, the data won’t be synchronized anywhere).

public interface ILogWriterService
{

    IMobileServiceSyncTable<LogEntry> LogTable { get; }
    Task Initialize();
}

public class LogEntry : BaseEntityData
{
    public DateTime LogDateTime { get; set; }
    public string Entry { get; set; }
}

public class LogWriterService : ILogWriterService
{
    private readonly MobileServiceClient mobileService = new MobileServiceClient(Constants.MobileServiceRootUri);

    private IMobileServiceClient MobileService
    {
        get { return mobileService; }
    }

    public IMobileServiceSyncTable<LogEntry> LogTable
    {
        get { return MobileService.GetSyncTable<LogEntry>(); }
    }

    public async Task Initialize()
    {
        var data = new MobileServiceSQLiteStore("log.db");
        data.DefineTable<LogEntry>();

        await MobileService.SyncContext.InitializeAsync(data, new MobileServiceSyncHandler());
    }
}

Next I define the high level service interface, ILogService:

public interface ILogService
{
    void Debug(string message);

    void Exception(string message, Exception ex);
}

So far, all of these classes have been in the Core library. However, the implementation of the ILogService has to be in the Shared.Client project as it needs to be used by all the client projects.

public class LogService : ILogService
{
    public ILogWriterService Writer { get; set; }

    public ILogger Logger { get; set; }

    public LogService(ILogWriterService writer)
    {
        Writer = writer;
        var target = new MobileServiceTarget(Writer);

        LogManagerFactory.DefaultConfiguration.AddTarget(LogLevel.Debug, target);

        Logger = LogManagerFactory.DefaultLogManager.GetLogger("Default");
    }

    public void Debug(string message)
    {
        Logger.Debug(message);
    }

    public void Exception(string message, Exception ex)
    {
        Logger.Error(message, ex);
    }
}

As you can see the implementation sets up the MetroLog logger but uses a custom MobileServiceTarget as the destination. This is implemented as follows:

public class MobileServiceTarget : Target
{
    public ILogWriterService Writer { get; set; }
    public MobileServiceTarget(ILogWriterService writer)
        : base(new SingleLineLayout())
    {
        Writer = writer;
    }

    private bool InitCompleted { get; set; }
    protected async override Task<LogWriteOperation> WriteAsyncCore(LogWriteContext context, LogEventInfo entry)
    {
        try
        {
            if (!InitCompleted)
            {
                await Writer.Initialize();
                InitCompleted = true;
            }
            var log = new LogEntry { LogDateTime = DateTime.Now, Entry = entry.ToJson() };
            await Writer.LogTable.InsertAsync(log);
            return new LogWriteOperation(this, entry, true);
        }
    catch (Exception)
    {
        // Swallow failures here – logging should never bring down the app
        return new LogWriteOperation(this, entry, false);
    }
    }
}

I of course need to register the implementations with Autofac:

builder.RegisterType<LogWriterService>().As<ILogWriterService>();
builder.RegisterType<LogService>().As<ILogService>();

And the last thing is a static helper class that makes logging for two core scenarios really easy:

public static class LogHelper
{
    public static void Log<TEntity>(TEntity entity, [CallerMemberName] string caller = null)
    {
        var json = JsonConvert.SerializeObject(entity);
        Log(typeof(TEntity).Name + ": " + json, caller);
    }

    public static void Log(string message = null, [CallerMemberName] string caller = null)
    {
        try
        {
            InternalWriteLog("[" + caller + "] " + message);
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex.Message);
        }
    }

    public static void Log(this Exception ex, string message = null, [CallerMemberName] string caller = null)
    {
        try
        {
            Debug.WriteLine("Exception ({0}): {1}", caller, ex.Message);
            InternalWriteException(caller + ": " + message, ex);
        }
        catch (Exception ext)
        {
            Debug.WriteLine(ext.Message);
        }
    }

    private static ILogService logService;

    private static ILogService LogService
    {
        get
        {
            if (logService == null)
            {
                logService = ServiceLocator.Current.GetInstance<ILogService>();

            }
            return logService;
        }
    }

    private static void InternalWriteLog(string message)
    {
        try
        {

            LogService.Debug(message);
        }
        catch (Exception ext)
        {
            Debug.WriteLine(ext.Message);
        }
    }

    private static void InternalWriteException(string message, Exception ex)
    {
        try
        {
            LogService.Exception(message, ex);
        }
        catch (Exception ext)
        {
            Debug.WriteLine(ext.Message);
        }
    }
}

The first scenario is a simple string output eg:

LogHelper.Log("Startup complete");

The second is logging the output of an Exception:

try
{
   …
}
catch (Exception ex)
{
    ex.Log();
}

Note that the Exception logging also does a Debug.WriteLine which is convenient during development to pick up any issues in the Output window.

Creating Design Time Data in Blend for Shared XAML Pages

In a previous post I created a second page for my Universal (Windows/Windows Phone) applications which was placed in the Shared project. Unfortunately Blend doesn’t support design time data for XAML pages that are in shared projects. However, there is a trick to get the design time data wired up and displaying for these pages.

I’ll start by opening the MainPage for the Windows application, which is in the Windows project as it’s not shared with the Windows Phone application. From the Data pane I can go ahead and create a sample data set. There are different ways to partition your sample data – I prefer to have a sample data set per page; in this case the data is for SecondPage, which is in the shared project.

In this case my design time data is just going to be made up of a single complex entity, CurrentProperty.

Next I am going to remove the SecondPageDataSource that was created on the MainPage and add it instead to the Application.Resources section in the App.xaml. Note that I needed to add the d, mc and SampleData namespaces.

xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    xmlns:SampleData="using:Blend.SampleData.SecondPageDataSource"
    mc:Ignorable="d">
    <Application.Resources>
        <SampleData:SecondPageDataSource x:Key="SecondPageDataSource" d:IsDataSource="True"/>

Now when I open up the SecondPage I’ll see that there is a SecondPageDataSource under the Data pane.

One thing to be aware of is that this will only exist when designing the Windows application. Unfortunately you’ll need to create a different sample data set for use when designing the page for Windows Phone.

Blend Designer Error Due to Service Locator

I was just about to get started using Blend to layout a page and noticed that there was an error in the Results pane in Blend, stating that the ServiceLocatorProvider must be set.

I was pretty certain that this was something I was doing during startup but clearly that code isn’t being run correctly at design time. Turns out I don’t really care at design time since I’m going to predominantly use design time data. This means that in the ViewModelLocator constructor I can simply exit if it’s being invoked at design time. Unfortunately the usual design mode property that you can query to determine if the code is being run at design time doesn’t exist in the view model locator scope. Luckily an alternative is to simply query if the service locator provider has been set:

public ViewModelLocator()
{
    if (!ServiceLocator.IsLocationProviderSet) return;
    DataService = ServiceLocator.Current.GetInstance<IDataService>();
    SyncService = ServiceLocator.Current.GetInstance<ISyncService>();
    NavigateService = ServiceLocator.Current.GetInstance<INavigateService>();
}

Synchronizing in a Background Task

Now that we have implementations for IDataService and ISyncService, I can update the background task for the Windows platform applications to perform synchronization in the background. To begin with I need to reference the Autofac libraries (including the Microsoft common service locator and extensions libraries) by adding them to the RealEstateInspector.Background project via the NuGet package manager.

The next thing is to update the Run method of the background task so that it looks for a current access token and uses it to initialise the data service.

public async void Run(IBackgroundTaskInstance taskInstance)
{
    // Take a deferral so the task isn't torn down while awaiting async operations
    var deferral = taskInstance.GetDeferral();

    try
    {
        var cost = BackgroundWorkCost.CurrentBackgroundWorkCost;

        var authContext = new AuthenticationContext(Constants.ADAuthority);
        if (authContext.TokenCache.ReadItems().Count() > 0)
        {
            authContext = new AuthenticationContext(authContext.TokenCache.ReadItems().First().Authority);
        }

        var authResult =
            await
                authContext.AcquireTokenSilentAsync(Constants.MobileServiceAppIdUri,
                Constants.ADNativeClientApplicationClientId);
        if (authResult != null && !string.IsNullOrWhiteSpace(authResult.AccessToken))
        {
            var dataService = ServiceLocator.Current.GetInstance<IDataService>();
            var syncService = ServiceLocator.Current.GetInstance<ISyncService>();

            await dataService.Initialize(authResult.AccessToken);

            if (cost == BackgroundWorkCostValue.High)
            {
                await syncService.ForceUpload();
            }
            else
            {
                await syncService.Synchronise(true);
            }
        }
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message);
    }
    finally
    {
        if (deferral != null)
        {
            deferral.Complete();
        }
    }
}

Depending on the cost of the background task I either want the task to force an upload of pending updates (if high cost), or do a full synchronisation.

Navigation in the WPF Application Between View Models

In my previous post I showed adding a INavigateService to facilitate navigation between view models. This included an implementation of the service for Universal applications. For WPF the implementation looks very similar:

public class WPFNavigationService : CoreNavigateService<Page>
{
    protected override void NavigateToView(Type viewType)
    {
        (App.Current.MainWindow.Content as Frame).Navigate(new Uri("/Pages/" + viewType.Name + ".xaml", UriKind.RelativeOrAbsolute));
    }
}

Note that this assumes that pages are in the Pages folder of the project.

The other change required is that the WPF application needs a Frame which can be used to navigate between pages. So the MainWindow now looks like:

<Window x:Class="RealEstateInspector.Desktop.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="MainWindow" Height="350" Width="525">
    <Frame Source="/Pages/MainPage.xaml" />
</Window>

All the content that was in the MainWindow is now in MainPage, and the type registration at application startup is getting more complex:

public void ApplicationStartup()
{
    CoreApplication.Startup(builder =>
    {

        builder.RegisterType<SignalRFactory>().As<ISignalR>();
#if NETFX_CORE
        builder.RegisterType<UniversalUIContext>().As<IUIContext>();
        builder.RegisterType<WindowsPlatformNavigationService>().SingleInstance().As<INavigateService>();

#elif DESKTOP
        builder.RegisterType<WPFNavigationService>().SingleInstance().As<INavigateService>();
#endif
    });

#if NETFX_CORE
    var navService = ServiceLocator.Current.GetInstance<INavigateService>() as WindowsPlatformNavigationService;
#elif DESKTOP
    var navService = ServiceLocator.Current.GetInstance<INavigateService>() as WPFNavigationService;
#endif
#if NETFX_CORE || DESKTOP
    navService.Register<MainViewModel, MainPage>();
    navService.Register<SecondViewModel, SecondPage>();
#endif
}

This creates a navigation bar which allows the user to navigate back between pages:

Navigation Service for Cross Platform Page/View Navigation from View Model

I already have a solution for allowing the view model to jump back onto the UI thread using the UIContext. However, I currently don’t have a mechanism that will allow one view model to initiate navigation to a new page/view. What I want is the ability for one view model to request navigation by specifying the destination view model. Of course, this needs to work cross platform. There are essentially two strategies that most mvvm-style navigation frameworks use: the first is by convention, where the destination page/view is looked up based on the name of the target view model; the second is by defining an explicit mapping between view models and the corresponding page/view. In this case I’m going to go with the latter. In the Core library I define a couple of interfaces and an abstract implementation:

public interface INavigateService
{
    void Navigate<TViewModel>() where TViewModel : IDataViewModel;
}

public interface INativeNavigateService<TView> : INavigateService
    where TView : class,new()
{
    void Register<TViewModel, TViewType>() where TViewType : TView;
}

public abstract class CoreNavigateService<TView> : INativeNavigateService<TView> where TView : class, new()
{
    private readonly IDictionary<Type, Type> viewDictionary = new Dictionary<Type, Type>();

    protected Type ViewType<TViewModel>()
    {
        Type viewType = null;
        viewDictionary.TryGetValue(typeof(TViewModel), out viewType);
        return viewType;
    }

    public void Register<TViewModel, TViewType>() where TViewType : TView
    {
        // Map the view model type to the concrete view type (not the TView base type)
        viewDictionary[typeof(TViewModel)] = typeof(TViewType);
    }

    public void Navigate<TViewModel>() where TViewModel : IDataViewModel
    {
        var navType = ViewType<TViewModel>();
        NavigateToView(navType);
    }

    protected abstract void NavigateToView(Type viewType);
}

Next, in the client projects, I define a class that inherits from CoreNavigateService and implements the NavigateToView method. Here is the Windows Platform implementation:

public class WindowsPlatformNavigationService : CoreNavigateService<Page>
{
    protected override void NavigateToView(Type viewType)
    {
        (Window.Current.Content as Frame).Navigate(viewType);
    }
}
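
As a point of comparison, the convention-based strategy I decided against would derive the view type from the view model’s name at runtime, roughly along these lines (the namespace convention here is hypothetical):

```csharp
using System;
using System.Reflection;

public static class ConventionViewLocator
{
    // e.g. "MyApp.ViewModels.MainViewModel" resolves to "MyApp.Views.MainPage"
    public static Type ResolveViewType(Type viewModelType)
    {
        var viewTypeName = viewModelType.FullName
            .Replace(".ViewModels.", ".Views.")
            .Replace("ViewModel", "Page");
        // Assumes views live in the same assembly as their view models
        return viewModelType.GetTypeInfo().Assembly.GetType(viewTypeName);
    }
}
```

The explicit Register<TViewModel, TViewType> mapping avoids the fragility of this string manipulation, at the cost of a little registration code at startup.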

The ApplicationStartup method now looks like:

public void ApplicationStartup()
{
    CoreApplication.Startup(builder =>
    {

        builder.RegisterType<SignalRFactory>().As<ISignalR>();
#if NETFX_CORE
        builder.RegisterType<UniversalUIContext>().As<IUIContext>();

        builder.RegisterType<WindowsPlatformNavigationService>().SingleInstance().As<INavigateService>();
#endif
    });

    var navService = ServiceLocator.Current.GetInstance<INavigateService>() as WindowsPlatformNavigationService;

#if NETFX_CORE
    navService.Register<MainViewModel,MainPage>();
    navService.Register<SecondViewModel,SecondPage>();
#endif
}

IDataViewModel, BaseViewModel and ViewModelLocator all need to be extended to include an INavigateService property called NavigateService. Now, from within a view model, navigation can be invoked by calling NavigateService.Navigate<SecondViewModel>().
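
The extensions themselves mirror the existing DataService/SyncService plumbing; a sketch of the additions (following the same pattern my ViewModelLocator uses to hand services to view models):

```csharp
public class BaseViewModel : INotifyPropertyChanged
{
    public IDataService DataService { get; set; }
    public ISyncService SyncService { get; set; }

    // New: set by the ViewModelLocator when the view model is created
    public INavigateService NavigateService { get; set; }

    // ... existing INotifyPropertyChanged implementation elided ...
}

// And in ViewModelLocator.CreateViewModel, alongside the other service assignments:
// baseVM.NavigateService = NavigateService;
```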

Refactoring ViewModelLocator with Autofac

After reviewing the way that I was constructing the IDataService and ISyncService implementations, I figured that I wasn’t really leveraging Autofac very well. I realised that I could refactor the ViewModelLocator to at least look up the current implementation, for example:

public ViewModelLocator()
{
    DataService= ServiceLocator.Current.GetInstance<IDataService>();
    SyncService = ServiceLocator.Current.GetInstance<ISyncService>();
}

Of course for this to work I have to register the types, which can be done in the ApplicationCore class since the implementations of both interfaces are located in the Core library:

public class ApplicationCore
{
    public void Startup(Action<ContainerBuilder> dependencyBuilder)
    {
        var builder = new ContainerBuilder();

        builder.RegisterType<DataService>().SingleInstance().As<IDataService>();
        builder.RegisterType<SyncService>().SingleInstance().As<ISyncService>();

        dependencyBuilder(builder);

        // Build the container and expose it via the common service locator
        // (Autofac.Extras.CommonServiceLocator)
        var container = builder.Build();
        var serviceLocator = new AutofacServiceLocator(container);
        ServiceLocator.SetLocatorProvider(() => serviceLocator);
    }
}
Note however, that whilst this is a bit of an improvement, the ViewModelLocator is still an example of a service locator, which is an anti-pattern (http://blog.ploeh.dk/2010/02/03/ServiceLocatorisanAnti-Pattern/). I’m yet to find a workable improvement that will still allow me to construct the ViewModelLocator in XAML.

Integration Synchronization Wrapper and Restructuring Application Services

So far all of the Mobile Service operations, including holding the instance of the MobileServiceClient, have been handled by the MainViewModel. Clearly, as the application grows this is not a viable solution, so we need some application services which can hold the reference to the MobileServiceClient and facilitate application logic such as data access and synchronisation. To this end I’m going to create two services, IDataService and ISyncService, with their corresponding implementations as follows:

public interface IDataService
{
    IMobileServiceClient MobileService { get; }

    Task Initialize(string aadAccessToken);
}

public class DataService: IDataService
{
    private readonly MobileServiceClient mobileService = new MobileServiceClient(
        Constants.MobileServiceRootUri,
        "wpxaIplpeXtkn------QEBcg12",
        new MobileServiceHttpHandler()
        );

    public IMobileServiceClient MobileService
    {
        get { return mobileService; }
    }

    public async Task Initialize(string aadAccessToken)
    {
        var jobj = new JObject();
        jobj["access_token"] = aadAccessToken;
        var access = await MobileService.LoginAsync(MobileServiceAuthenticationProvider.WindowsAzureActiveDirectory, jobj);
        Debug.WriteLine(access != null);
        var data = new MobileServiceSQLiteStore("inspections.db");
        data.DefineTable<RealEstateProperty>();
        data.DefineTable<Inspection>();

        await MobileService.SyncContext.InitializeAsync(data, new CustomMobileServiceSyncHandler());

    }
}

The IDataService implementation holds the reference to the IMobileServiceClient. This needs to be initialized by passing in the Azure Active Directory access token, but thereafter the MobileService accessor can be used to access data directly through the IMobileServiceClient instance.

public interface ISyncService
{
    event EventHandler<DualParameterEventArgs<double, string>> Progress;
    Task Synchronise(bool waitForCompletion);
    Task ForceUpload();
}

public class SyncService: ISyncService
{
    [Flags]
    private enum SyncStages
    {
        None = 0,
        UploadChanges = 1,
        PullProperties = 2,
        PullInspections = 4,
        All = UploadChanges | PullProperties | PullInspections
    }

    public event EventHandler<DualParameterEventArgs<double,string>> Progress;

    public IDataService DataService { get; set; }

    private ISynchronizationContext<SyncStages> SynchronizationManager { get; set; }

    public SyncService(IDataService dataService)
    {
        DataService = dataService;
        SynchronizationManager = new SynchronizationContext<SyncStages>();
        SynchronizationManager.DefineSynchronizationStep(SyncStages.UploadChanges, UploadPendingLocalChanges);
        SynchronizationManager.DefineSynchronizationStep(SyncStages.PullProperties, DownloadChangesToRealEstateProperties);
        SynchronizationManager.DefineSynchronizationStep(SyncStages.PullInspections, DownloadChangesToInspections);
        SynchronizationManager.SynchronizationChanged += SynchronizationManager_SynchronizationProgressChanged;
    }

    public async Task Synchronise(bool waitForCompletion)
    {
        await SynchronizationManager.Synchronize(SyncStages.All, waitForSynchronizationToComplete: waitForCompletion);
    }

    public async Task ForceUpload()
    {
        await SynchronizationManager.Synchronize(SyncStages.UploadChanges, true, true);
    }

    private void SynchronizationManager_SynchronizationProgressChanged(object sender, SynchronizationEventArgs<SyncStages> e)
    {
        var message = e.ToString();
        if (Progress != null)
        {
            Progress(this,new object[]{ e.PercentageComplete, message});
        }
    }

    private async Task<bool> UploadPendingLocalChanges(ISynchronizationStage<SyncStages> stage)
    {
        await DataService.MobileService.SyncContext.PushAsync(stage.CancellationToken);
        return true;
    }
    private async Task<bool> DownloadChangesToRealEstateProperties(ISynchronizationStage<SyncStages> stage)
    {
        await DataService.MobileService.PullLatestAsync<RealEstateProperty>(stage.CancellationToken);
        return true;
    }
    private async Task<bool> DownloadChangesToInspections(ISynchronizationStage<SyncStages> stage)
    {
        await DataService.MobileService.PullLatestAsync<Inspection>(stage.CancellationToken);
        return true;
    }
}

The ISyncService defines the actual synchronization steps. Rather than simply exposing a generic Synchronize method that accepts a SyncStages parameter to determine which steps are synchronized, the ISyncService exposes high level methods for performing a full synchronization (Synchronise) and for just uploading pending changes (ForceUpload). Note that the former has a parameter indicating whether the method should wait for synchronization to complete before returning, whereas the latter always waits for the upload part of the synchronization to complete.

To make these services available to the view models of the application the BaseViewModel has been updated to include properties for both services:

public class BaseViewModel : INotifyPropertyChanged
{
    public IDataService DataService { get; set; }
    public ISyncService SyncService { get; set; }

And of course the ViewModelLocator is updated to create instances of these services and assign them to the view model when they’re created:

public class ViewModelLocator
{
    public IDataService DataService { get; set; }
    public ISyncService SyncService { get; set; }

    public ViewModelLocator()
    {
        DataService=new DataService();
        SyncService=new SyncService(DataService);
    }

    public MainViewModel Main
    {
        get { return CreateViewModel<MainViewModel>(); }
    }

    private readonly Dictionary<Type, object> viewModels = new Dictionary<Type, object>();

    private T CreateViewModel<T>() where T:new()
    {
        var type = typeof (T);
        object existing;
        if (!viewModels.TryGetValue(type, out existing))
        {
            existing = new T();
            viewModels[type] = existing;
        }

        var baseVM = existing as BaseViewModel;
        if (baseVM != null)
        {
            baseVM.DataService = DataService;
            baseVM.SyncService = SyncService;
        }

        return (T)existing;
    }
}

Complex Synchronization Wrapper

One of the more complex tasks in building offline-enabled (aka occasionally connected/disconnected) software is how you handle synchronization. Most synchronization frameworks typically handle synchronization of one form of data. For example Mobile Services allows for synchronization of individual tables based on a query. However, for most applications this isn’t sufficient – they may require database synchronization in addition to uploading new images taken on the device and downloading associated documents for offline viewing. This means you need a synchronization layer that can coordinate synchronization of different data types/formats/processes.

There may be times within the application where you don’t want to perform a full synchronization. For example, if the user creates a new record, the application should attempt to push this new record to the server immediately, but it may not want to do a full synchronization until the user hits the sync button. This means you need a mechanism where you can partition the synchronization layer and only trigger synchronization of individual parts as required.

Here’s an example of what the synchronization wrapper might look like in action:

[Flags]
public enum SyncStages
{
    None=0,
    Stage1=1,
    Stage2=2,
    Stage3=4,
    All = Stage1 | Stage2 | Stage3
}

public class TestSynchronization
{
    public SynchronizationContext<SyncStages> SynchronizationManager { get; set; }

    public Action<string> Progress { get; set; }

    public TestSynchronization()
    {
        SynchronizationManager = new SynchronizationContext<SyncStages>();
        SynchronizationManager.DefineSynchronizationStep(SyncStages.Stage1, Step1);
        SynchronizationManager.DefineSynchronizationStep(SyncStages.Stage2, Step2);
        SynchronizationManager.DefineSynchronizationStep(SyncStages.Stage3, Step3);
        SynchronizationManager.SynchronizationChanged += SynchronizationManager_SynchronizationProgressChanged;
    }

    private void SynchronizationManager_SynchronizationProgressChanged(object sender, SynchronizationEventArgs<SyncStages> e)
    {
        var message = e.ToString();
        Progress(message);
    }

    private const int Step1Stages = 5;

    public async Task<bool> Step1(ISynchronizationStage<SyncStages> step)
    {
        step.RegisterSubStagesCount(Step1Stages);
        for (int i = 0; i < Step1Stages; i++)
        {
            step.StartSubStage();
            await Task.Delay(1000, step.CancellationToken);
            step.EndSubStage();
            if (step.CancellationToken.IsCancellationRequested) return false;
        }
        return true;
    }

    public async Task<bool> Step2(ISynchronizationStage<SyncStages> step)
    {
        await Task.Delay(2*1000, step.CancellationToken);
        return true;
    }

    public async Task<bool> Step3(ISynchronizationStage<SyncStages> step)
    {
        step.RegisterSubStages(Step3Stages.S3Stage1, Step3Stages.S3Stage2, Step3Stages.S3Stage3);

        await step.RunSubStage(Step3Stages.S3Stage1, Step3Sub);
        await step.RunSubStage(Step3Stages.S3Stage2, Step3Sub);
        await step.RunSubStage(Step3Stages.S3Stage3, Step3Sub);

        return true;
    }

    private enum Step3Stages
    {
        S3Stage1,
        S3Stage2,
        S3Stage3
    }

    private async Task<bool> Step3Sub(ISynchronizationStage<Step3Stages> step)
    {
        step.Progress(0.2);
        await Task.Delay(2000);
        step.Progress(0.7);
        await Task.Delay(2000);
        step.Progress(1.0);
        return true;
    }

}

Triggering and/or cancelling synchronization can then be done using the following:

await model.SynchronizationManager.Synchronize(SyncStages.Stage1 | SyncStages.Stage2, waitForSynchronizationToComplete: true);
await model.SynchronizationManager.Synchronize(SyncStages.All,
                cancelExistingSynchronization:true,
                waitForSynchronizationToComplete: true);
await model.SynchronizationManager.Cancel(true);

The first line triggers stages 1 and 2 to be synchronized – it won’t cancel any existing synchronization process and will only return once synchronization has completed. The second line triggers all stages to be synchronized and will cancel any existing synchronization process. The third line cancels any existing synchronization and only returns once it has been cancelled.

I’ve attached a first pass at an implementation of such a sync framework. Note that the actual sync logic is in the steps shown above; it’s the framework for scheduling them and reporting progress that is being shown in the sample.

SynchronizationWrapper.zip (48.4KB)