DVLA Vehicle Enquiry Service API

I’ve recently been developing an application for the car industry, to record vehicles, models, customer information, vehicle sales, and to store images. It is an ASP.NET Blazor application which utilises Entity Framework Core, Microsoft Azure, Azure Websites and Azure Storage. One feature I thought would be really interesting is for the car sales administrators to enter new vehicles into their stock and then have a component which dynamically queries the vehicle information via the DVLA VES Web API.

Sometimes, to start developing a component, I build the component code outside of Blazor first, i.e. as a console application which contains all the response structures and associated methods, and calls the VES API with an HTTP client.

Now that the basics of calling the VES web API are working, I can expand this with all the necessary error handling, e.g. for HTTP error codes, and include it as a component within the Blazor application.
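As a minimal sketch of what the console app does — assuming the public VES v1 endpoint and the same header/body shape as the PowerShell script further down; the class name and the raw JSON dump are just for illustration, the real app deserialises into response classes — the call looks something like this:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class VesClientSample
{
    // The VES v1 vehicle enquiry endpoint
    const string VesUri = "https://driver-vehicle-licensing.api.gov.uk/vehicle-enquiry/v1/vehicles";

    static async Task Main()
    {
        using var client = new HttpClient();

        // Replace with your own VES API key and registration number
        client.DefaultRequestHeaders.Add("x-api-key", "[APIKEY]");
        var body = new StringContent("{\"registrationNumber\":\"TE57VRN\"}", Encoding.UTF8, "application/json");

        HttpResponseMessage response = await client.PostAsync(VesUri, body);

        // The real component deserialises this into response classes;
        // here we just dump the raw JSON
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```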

I’ve published the .NET console application in my GitHub repo in the link below. You will need to request a VES API key and use your own vehicle registrations; either set these as arguments in the console app, or just set them as static strings, as in the current code.


If you prefer to perform a simpler test, I’ve provided a working PowerShell script below; you will just need to use your own API key and set the registration as required.

$VehicleRegistrationNumber = "TE57VRN"
$APIKey = "[APIKEY]"
$DVLAURI = "https://driver-vehicle-licensing.api.gov.uk/vehicle-enquiry/v1/vehicles"
$Header = @{"x-api-key" = $APIKey} 
$RequestBody = @{"registrationNumber"= $VehicleRegistrationNumber} | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri $DVLAURI -Body $RequestBody -ContentType "application/json" -Headers $Header

Azure VM Managed Disks backup to Azure Blob Storage

I’ve recently had a requirement to back up managed disks to Azure blob storage. Rather than performing this manually, I scripted it with PowerShell. I created a .CSV input file which specifies the virtual machine names and the storage container and folder the disks will be copied into. The script and a sample input file are in my repository below.

Copy Azure Virtual Machine Managed Disks to Azure Blob Storage PowerShell Script

Sample Input .CSV File

You will need to update the following variables in the PowerShell script and create your own input file.

$VMNames = Import-Csv -LiteralPath "FULL-PATH-TO-CSV-FILE"
$TargetStorageAccountName = "YOUR-STORAGE-ACCOUNT-NAME"
$TargetStorageAccountKey = "YOUR-STORAGE-ACCOUNT-KEY"
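The core of the script is to generate a short-lived SAS on each managed disk and copy it server-side into the target container. A hedged sketch of that loop — the CSV column names (VMName, Container, Folder) are illustrative here; match them to the sample input file in the repo — might look like:

```powershell
# Sketch only: assumes an existing Az PowerShell session (Connect-AzAccount)
$Context = New-AzStorageContext -StorageAccountName $TargetStorageAccountName -StorageAccountKey $TargetStorageAccountKey

foreach ($Entry in $VMNames) {
    $VM = Get-AzVM -Name $Entry.VMName
    $DiskName = $VM.StorageProfile.OsDisk.Name

    # Grant temporary read access to the managed disk via a SAS
    $SAS = Grant-AzDiskAccess -ResourceGroupName $VM.ResourceGroupName -DiskName $DiskName -Access Read -DurationInSecond 3600

    # Start the server-side copy into the container/folder from the CSV
    Start-AzStorageBlobCopy -AbsoluteUri $SAS.AccessSAS -DestContainer $Entry.Container -DestBlob "$($Entry.Folder)/$DiskName.vhd" -DestContext $Context

    # Wait for the copy to finish before revoking the SAS
    Get-AzStorageBlobCopyState -Container $Entry.Container -Blob "$($Entry.Folder)/$DiskName.vhd" -Context $Context -WaitForComplete | Out-Null
    Revoke-AzDiskAccess -ResourceGroupName $VM.ResourceGroupName -DiskName $DiskName
}
```

Revoking the SAS only after the copy completes matters: Start-AzStorageBlobCopy is asynchronous on the service side, and the source SAS must remain valid for the duration of the copy.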

Cosmos DB World Blazor Application with DevExpress UI Components and Bing Maps: Part 2

A few months ago I developed and published a post on a Blazor Server project which utilises DevExpress UI Framework components, Cosmos DB and Bing Maps with JSInterop. There are two Razor UI components presented on the main index page: these display country and city selector drop-down controls, a tiny JavaScript file to interact with the Bing Maps API, and a Bing Maps UI component.

To use this solution, you will need to provision a Cosmos DB account and create a single collection. You can use the Cosmos DB migration tool to load the dataset below into the Cosmos DB collection. Various appsettings need to be updated to reflect your own Cosmos DB account, collection, partition key, database and Cosmos DB key. The _Host.cshtml will also need to be updated with the JavaScript link reference for your own Bing Maps key.
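For reference, the settings involved are along these lines — the key names below are illustrative, so match them to the names actually used in the project's appsettings.json:

```json
{
  "CosmosDb": {
    "Account": "https://your-account.documents.azure.com:443/",
    "Key": "your-cosmos-db-key",
    "DatabaseName": "your-database",
    "ContainerName": "your-collection",
    "PartitionKey": "/your-partition-key"
  }
}
```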

The project site is hosted on Microsoft Azure and I have now uploaded the code to my public GitHub repo. Follow the links below.

Cosmos DB World Web Site

Cosmos DB World Source Code

Cosmos DB World Countries DataSet

My Public Code Repository on GitHub

For a while now, I have been adding code inline within my blog posts. I’ve now released a public repository on GitHub, which I will be using moving forward to publish all my community-shared code, free for use under the MIT license.

I will start to publish my previous code into the public repository; for now I’ve published a recent .NET 5 console app, a basic RSS reader that currently reads the .NET team’s RSS feed. My public repository can be found below.

Tejinder Rai on GitHub
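The RSS reader itself boils down to very little code. A minimal sketch — assuming the System.ServiceModel.Syndication NuGet package, and with the feed URL shown here being my assumption of the .NET team's blog feed rather than the exact one used in the repo:

```csharp
using System;
using System.ServiceModel.Syndication;
using System.Xml;

class RssReaderSample
{
    static void Main()
    {
        // Assumed feed URL for the .NET team's blog
        using XmlReader reader = XmlReader.Create("https://devblogs.microsoft.com/dotnet/feed/");
        SyndicationFeed feed = SyndicationFeed.Load(reader);

        // Print the date and title of each post in the feed
        foreach (SyndicationItem item in feed.Items)
        {
            Console.WriteLine($"{item.PublishDate:d} - {item.Title.Text}");
        }
    }
}
```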

Blazor File Uploads – to Azure Blob Storage

The Intro…

You might wonder: you’re an Architect, why are you posting blogs about coding? Good question, I would say 🙂 Working as an Architect is my day job, but I have many different ideas in my head and some of them just require a bit more thought, where I might need to build a PoC. Sometimes, as with this post, I like to try out new frameworks and libraries, or just build samples for fun. Having already coded for well over a decade in .NET doesn’t stop me from wanting to keep my coding skills up to date, especially C#, or from trying out different architectural patterns to understand how they can be applied in software applications, hybrid architectures and cloud-native architectures, or how they can help a project I am working on at a specific point in time. From a work perspective, all things Microsoft are highly important to me, especially since I consider myself lucky enough to have spent a fair bit of time working at Microsoft Canada, Microsoft UK and in Seattle at Microsoft HQ, and at Microsoft conferences being trained and kept up to date on technologies, which were not always about Microsoft Azure of course.

Now back to this post…

With the recent updates announced at .NET Conf 2020 last week, I decided to try out the new InputFile UI component in Blazor. Steve Sanderson had already posted a blog about an InputFile component here, back in September 2019; it is now included with the launch of .NET 5. I’ve also been using the DevExpress Blazor UI Components library and its <DXUpload/> UI component, which I also highly recommend.

In this post I wanted to try out the new native InputFile control and upload some files to Azure blob storage for testing purposes. I am working on a project which will require basic file upload and download functionality to and from Azure blob storage; I may add a future post about one of the exciting new projects I am working on.

Note: This is a very basic implementation for the purposes of this blog post; it is not at all production ready.

Nuget Packages

With any .NET project, the first thing you may want to do is decide how you want to build out your components: whether you have existing shared libraries, component libraries or NuGet packages you work with regularly, or whether you prefer building your own supporting toolsets and templates. You may not even need to include anything from the outset, and can simply add packages while building out your solution in an agile way.

In my sample project, a Blazor Server application called BlazorFileUploader, I used the following NuGet packages: the Azure.Storage.Blobs client library and DevExpress.Blazor.

Note: As I mentioned earlier in this article, I have been testing the DevExpress <DXUpload> UI component, but it is not used in this post.

The code…


The following configuration was added for the Azure blob storage connection string. Of course, in an enterprise scenario, if you’re hosting your application in Azure App Service, using Azure Key Vault is recommended instead.

"Storage": {
  "StorageAccountConnectionString": "DefaultEndpointsProtocol=https;AccountName=[YourStorageAccountName];AccountKey=[YourStorageAccountKey];EndpointSuffix=core.windows.net"
}


Using the default configuration provider, the connection string is read at startup and stored in a static config context class.

public class Startup
{
    public IConfiguration Configuration { get; }
    readonly string MyAllowSpecificOrigins = "_myAllowSpecificOrigins";

    public Startup(IConfiguration configuration)
    {
        Configuration = configuration;
        // Read the connection string from the "Storage" configuration section
        Config.AzureStorageConnectionString = Configuration["Storage:StorageAccountConnectionString"];
    }
}


The AzureStorageConnectionString is set by the code above, so it can be used when the upload is handled server side.

using System;

namespace BlazorAzureStorageFileLoader.Data
{
    public class Config
    {
        public static string AzureStorageConnectionString { get; set; }
    }
}


New Razor page

A new Razor page is created to handle the file uploads; the bulk of the code works with the <InputFile/> component.

Note: I hard-coded a container named “files” in the code. This is not necessary of course; you could build your own blob storage browser and even add methods to retrieve files and create containers.

@page "/fileloader"
@using System.IO
@using Azure.Storage.Blobs

<h4>Blob Storage File Loader</h4>

<InputFile OnChange="@UploadFiletoAzBlobStorage" />

@if (fileSelected)
{
    <div class="spinner-border" /><h5>Uploading file...</h5>
}

@code {

    int count = 1;
    string status;
    bool fileSelected = false;
    private string localFileName { get; set; } = "Not Selected";

    private void UpdateStatus(ChangeEventArgs e)
    {
        status = e.Value.ToString();
        count = count + 1;
    }

    async Task UploadFiletoAzBlobStorage(InputFileChangeEventArgs e)
    {
        var file = e.File;

        if (file != null)
        {
            fileSelected = true;

            string connectionString = Data.Config.AzureStorageConnectionString;

            // Max file size ~ 50 MB
            long maxFileSize = 1024 * 1024 * 50;

            BlobContainerClient container = new BlobContainerClient(connectionString, "files");

            try
            {
                BlobClient blob = container.GetBlobClient(file.Name);
                using (Stream fs = file.OpenReadStream(maxFileSize))
                {
                    await blob.UploadAsync(fs);
                }
                status = $"Finished loading {file.Size / 1024 / 1024} MB from {file.Name}";
            }
            catch (Exception ex)
            {
                status = ex.Message;
            }

            // Clean up after the test when we're finished
            fileSelected = false;
        }
        else
        {
            status = "No file selected!";
        }
    }
}


The navigation menu is updated to remove the default links, and the application display name is updated. A new navigation link is added for the fileloader Razor page.

<div class="top-row pl-4 navbar navbar-dark">
    <a class="navbar-brand" href="">Azure Storage FileLoader</a>
    <button class="navbar-toggler" @onclick="ToggleNavMenu">
        <span class="navbar-toggler-icon"></span>
    </button>
</div>

<div class="@NavMenuCssClass" @onclick="ToggleNavMenu">
    <ul class="nav flex-column">
        <li class="nav-item px-3">
            <NavLink class="nav-link" href="" Match="NavLinkMatch.All">
                <span class="oi oi-home" aria-hidden="true"></span> Home
            </NavLink>
        </li>
        @*<li class="nav-item px-3">
            <NavLink class="nav-link" href="counter">
                <span class="oi oi-plus" aria-hidden="true"></span> Counter
            </NavLink>
        </li>
        <li class="nav-item px-3">
            <NavLink class="nav-link" href="fetchdata">
                <span class="oi oi-list-rich" aria-hidden="true"></span> Fetch data
            </NavLink>
        </li>*@
        <li class="nav-item px-3">
            <NavLink class="nav-link" href="fileloader">
                <span class="oi oi-list-rich" aria-hidden="true"></span> File Loader
            </NavLink>
        </li>
        @*<li class="nav-item px-3">
            <NavLink class="nav-link" href="fileloader2">
                <span class="oi oi-list-rich" aria-hidden="true"></span> File Loader 2
            </NavLink>
        </li>*@
    </ul>
</div>

@code {
    private bool collapseNavMenu = true;

    private string NavMenuCssClass => collapseNavMenu ? "collapse" : null;

    private void ToggleNavMenu()
    {
        collapseNavMenu = !collapseNavMenu;
    }
}

The Test

The InputFile UI component is shown on the fileloader.razor page.

I chose a local file to upload.

The spinner is shown with the filename. The code which displays this in the Razor page is highlighted below.


@if (fileSelected)
{
    <div class="spinner-border" /><h5>Uploading file...</h5>
}

The file load to Azure blob storage is completed.

The blob is shown in the Azure storage account blob container.

That’s the simplest and quickest way to utilise the new <InputFile/> UI component in Blazor with .NET 5, together with Azure blob storage.

.NET Core 3.1 to .NET 5.0

Today marks the launch of .NET 5.0, the unified platform framework which runs on desktop, web, cloud, mobile, gaming, IoT and AI platforms. Catch all the updates in the sessions at the .NET conference, from November 10th through November 12th 2020, here. Ensure that you keep an eye on the support policy too, which is available here.

And more news, .NET 5.0 is available in Azure App Service, from today!

Useful Links:

• .NET Documentation https://aka.ms/dotnet-docs
• SDK Documentation https://aka.ms/dotnet-sdk-docs
• Release Notes https://aka.ms/dotnet5-release-notes
• Tutorials https://aka.ms/dotnet-tutorials
• Code Mag https://aka.ms/dotnet5-codemag

In addition, I’ve ensured Visual Studio 2019 is on the latest release, 16.8.0.

Cosmos DB World Blazor Application with DevExpress UI Components and Bing Maps: Part 1

I’ve recently been working on a Blazor Server application which integrates with Cosmos DB. The Cosmos DB account holds a single collection with a list of countries and cities, with coordinates for each city. I’ve had this small dataset for a while now; most free sources of such data can be downloaded from this location: http://www.geonames.org/.

The data was initially in .CSV format, and I first imported the dataset into an Azure SQL database. I later decided to copy the dataset into Cosmos DB and use the SDK within a Blazor application, added my DevExpress Blazor NuGet package for the drop-down controls, then integrated a Bing Maps component with the Bing Maps SDK. Transferring the data from Azure SQL to a Cosmos DB collection was straightforward using Azure Data Factory.

I have published the basic version of the Blazor application to https://cosmosdbworld.azurewebsites.net, where you can select a country and then a city using the DevExpress drop-down controls; the map is then updated dynamically, as the IJSRuntime interface is dependency-injected into a Bing Maps UI component. Overall, the Bing map renders extremely quickly, as the longitude and latitude are bound to the parameters of the Bing Maps component from the parent country and city drop-down component. The longitude and latitude coordinates are fetched from the Cosmos DB collection every time a city is selected, which refreshes the map automatically. Simple JavaScript functions update the <div> in the UI component with the coordinates each time the city is changed.
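The interop itself is a one-liner on each side. Here is a hedged sketch of the Blazor half — the component shape, parameter names and the updateMap function name are hypothetical stand-ins for the ones in the project:

```razor
@* BingMap.razor (sketch) *@
@inject IJSRuntime JS

@code {
    [Parameter] public double Latitude { get; set; }
    [Parameter] public double Longitude { get; set; }

    protected override async Task OnParametersSetAsync()
    {
        // Calls the small JavaScript function that re-centres the Bing map
        // whenever the parent drop-down components pass new coordinates
        await JS.InvokeVoidAsync("updateMap", Latitude, Longitude);
    }
}
```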

AADSTS Error Codes and References

I was working today on what appeared to be a conditional access policy issue in Azure AD. Whilst the sign-in logs suggested a conditional access policy, it actually turned out that the user required SSPR security information verification. During research, I found that new Azure AD tenants created from August 15th 2020 use the combined registration experience for both MFA and SSPR. This is documented here.

Here’s a view of what was occurring, which may help someone else in the future.


A cloud application was excluded from MFA, and the user also had a password policy applied to their Azure AD account not requiring password resets (something which was required for the scenario). When the user signed into the cloud application, they received a prompt for more security information, which from the sign-in logs appeared to point to a conditional access policy blocking the request.

Actual Issue

The actual issue was that the user needed to review their verification methods for SSPR, to keep the account secure. This is controlled through the password reset options. Upon review of the sign-in logs, an AADSTS50125 error was present, confirming the issue.

AADSTS Error Code Reference

The AAD Security Token Service error messages are well documented on this Microsoft docs page. You can also submit the code to the web page below:


Light Bulb Moment {}

Since the above form accepts an error action with a code, as per the HTML markup:

I decided to create my own Blazor application to submit AADSTS error codes for troubleshooting. I used the HtmlAgilityPack to pull out the response in the Blazor component and display the details of the error codes. This makes it easier to parse the response HTML and embed it into a Razor page with a (MarkupString) cast.
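A hedged sketch of that approach — the URL, query parameter and node selection below are my assumptions about the error-lookup page rather than its documented behaviour, and the console output stands in for the Razor rendering:

```csharp
using System;
using HtmlAgilityPack;

class AadstsLookupSample
{
    static void Main()
    {
        // Assumed: the lookup page accepts the error code as a query parameter
        string code = "50125";
        var web = new HtmlWeb();
        HtmlDocument doc = web.Load($"https://login.microsoftonline.com/error?code={code}");

        // Pull out the page body; in the Blazor component this string is
        // rendered with @((MarkupString)htmlValue)
        string htmlValue = doc.DocumentNode.SelectSingleNode("//body").InnerHtml;
        Console.WriteLine(htmlValue);
    }
}
```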

I have deployed the Blazor AADErrorChecker SPA here, using Azure App Service.

Azure AD Outage Sept 28, 2020; Tracking ID SM79-F88

Microsoft Azure suffered an outage with Azure AD authentication (Americas) on Monday, September 28th 2020, due to a software change in the service, which Microsoft reported was rolled back. As with other service health issues in cloud services, Microsoft constantly ensures customers are kept up to date with service issues and remediation activities. Azure Status is the main place to visit when reviewing cloud service health across all regions. For any historical status issues, the status history is also available to view.

I’ve seen a number of articles on several websites reporting the issues incorrectly and making wide assumptions about how Microsoft handles software code updates. But it’s important to understand a few basic things before making such assumptions…

Let’s talk about outages…

This isn’t the first outage we will see in any cloud platform, nor will it be the last.

Azure AD has an SLA; it’s important to understand what the SLA is and why it is stated. Don’t expect every cloud service to be up 100% of the time. The same happened in traditional data centres: things go wrong, the causes of outages are investigated and remediated, lessons are learned, and the affected service, or the process which affected it, has an opportunity to be improved as part of continual service improvement.

Failures occur in every industry

Unfortunately, there are failures and service outages across different industries that affect consumers on a daily basis. We can always remember things like:

The time power was lost in our houses

The time the car had a fault or a recall

The time when the internet connection dropped

How does this help?

Well, technically, for the outage itself it doesn’t, but one of the key takeaways from a service outage is the remediation plan. The postmortem which Microsoft provides as part of the response is key to improving service availability, and the governance around how the specific outage occurred can be improved to stop similar issues occurring again.

How does this help you?

As I mentioned earlier, it can be frustrating when outages occur, but in the long run you can expect that similar outages should not occur in the way the previous outage manifested itself. You can be sure that every outage is taken seriously and remediated, and you can expect innovative phases and releases to improve the service in the near future, to ensure outages are kept to a minimum. I imagine that, publicly, we only see a very small percentage of the communication that occurs around such incidents. To Microsoft, such incidents are a matter of paramount importance.

Whilst customers can’t specifically design for an Azure AD outage, as many Microsoft services rely on Azure AD, one of the key points of cloud computing is not to expect everything to be up and running 100% of the time. That said, every enterprise solution deployed to a public cloud platform should be designed for availability, with redundancy in mind, utilising the correct architectural patterns for each service component.

Closing thoughts

This has all reminded me of a quote by Tom Peters:

“Almost all quality improvement comes via simplification of design, manufacturing… layout, processes, and procedures.”
Tom Peters

Microsoft officially announce support end dates for ADAL and the Azure AD Graph API

I still have some old projects in my personal code repository which utilise ADAL and the Azure AD Graph API. Without question, there are most definitely software solutions on the market which still use the same NuGet packages, or the same endpoints via REST APIs, to authenticate users and pull back or update user profile information in a multitude of ways as required by the application.

Just in case you missed this, Microsoft have now provided official guidance on the end-of-support timelines for both the Active Directory Authentication Library (ADAL) and the Azure Active Directory Graph API.

I’ve been using the Microsoft Authentication Library (MSAL) and Microsoft Graph for a few years now, as the guidance from Microsoft has always urged developers to use MSAL and Microsoft Graph for all new development projects. So, now it’s official: time to start migrating over and work with the Microsoft Identity Platform v2.0 endpoint and v2.0 tokens.