Thursday, 20 April 2017

Simple Git branching strategy for release cycles

Coming up with a branching strategy that works well can be challenging when working with multiple developers and managing release cycles.
A simple approach is presented here for managing release cycles with a small to medium-sized team of developers, while still being able to react to production issues and fix bugs. The primary goal is to isolate work streams without impacting development progress.


Git does not enforce any particular strategy when it comes to branching, which is partly what makes it such a great and flexible version control system.
The problems start to arise, though, as you move into different stages of your development process. For example, you may have a release almost complete but not want to impede progress on the upcoming release cycle, which is where the majority of effort is required.

The Basic Approach

The focus is on producing a release while still being able to react to hotfixes or production issues without impacting ongoing development of features.
The branches we can create to produce this workflow are shown below:


Branching Workflow

As highlighted, the solution revolves around branch management, creating the right set of branches to make the process work.


The master branch is used purely for releases and is only merged into from a release or hotfix branch. The master branch stores the official release history. Version tags should be added to the commits on the master branch.


The development branch serves as an integration branch for features. It is initially branched from master. 


Once enough features have accumulated, or a release deadline is approaching, the release branch is forked from the development branch. The release is now feature-frozen; any features still in development are postponed to the next release cycle. The release version number is established, and release-related commits can continue to be added, along with bug fixes and documentation. The next release cycle can now continue without impacting the current release. A meaningful name should be used which includes 'release' and a short description.
Example: release-your-release-title


A feature branch is branched from the development branch. A meaningful name should be used, which may include an issue identifier from your work item tracking system (e.g. JIRA) and a basic description.
Example: <Jira Identifier>-your-feature-short-description
Features are only merged into the development branch after a pull request has been signed off or a code review has been performed and signed off. 
Feature branches should never interact directly with master.


A hotfix branch is used to patch a production release. As soon as the fix is complete, it should be merged into both development and master. A meaningful name and a basic description should be used.
Example: hotfix-your-hotfix-short-description


The following can be used as a checklist before committing or merging:
  • Features should never interact directly with master
  • Developers work locally and push branches to the central repository
  • development branch is used as the source branch for feature branches
  • Feature branches are used for all individual features being developed
  • Commits are performed daily into feature branches and pushed to the central repo
  • Features are only merged into development after a code review has been completed
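To make the moving parts concrete, the whole cycle can be sketched as a runnable walkthrough in a throwaway repository. All branch names, file names and version numbers below are purely illustrative:

```shell
# Throwaway demo repository; names and versions are illustrative.
rm -rf /tmp/git-flow-demo && mkdir /tmp/git-flow-demo && cd /tmp/git-flow-demo
git init -q
git config user.email demo@example.com && git config user.name demo
echo v1 > app.txt && git add . && git commit -qm "initial commit"
git branch -M master

# development: the integration branch, initially forked from master
git checkout -qb development

# feature branch, forked from development, named after a tracking item
git checkout -qb PROJ-123-add-login
echo login >> app.txt && git commit -qam "PROJ-123 add login"

# merge back into development (in practice only after a signed-off review)
git checkout -q development
git merge -q --no-ff -m "Merge PROJ-123-add-login" PROJ-123-add-login

# release branch, forked from development: the feature set is now frozen
git checkout -qb release-spring-update

# when ready, merge the release into master and tag the commit
git checkout -q master
git merge -q --no-ff -m "Release 1.2.0" release-spring-update
git tag -a v1.2.0 -m "Release 1.2.0"

# hotfix branch, forked from master, merged into both master and development
git checkout -qb hotfix-login-crash
echo fix >> app.txt && git commit -qam "fix login crash"
git checkout -q master
git merge -q --no-ff -m "Merge hotfix-login-crash" hotfix-login-crash
git tag -a v1.2.1 -m "Release 1.2.1"
git checkout -q development
git merge -q hotfix-login-crash
```

Note the --no-ff flag on each merge: it forces a merge commit even where a fast-forward is possible, which keeps the branch history visible on master and development.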


Git is a really powerful tool, and this approach is ultimately a discipline that requires adoption by every member of the team. It is only a guide, and hopefully you can take something away and adapt it to suit your particular workflow.

Tuesday, 14 March 2017

Azure Portal - Web-based code editor for App Services

A feature that caught my eye is the App Service Editor, a preview addition to the Development Tools section of the navigation bar of your app in the Azure Portal.

In essence it is a web-based editor for App Service. That's right, you can edit and save your web site LIVE with a clean, easy to use, web editor!

We have already come to love the Visual Studio Team Services code editor, which allows code and config edits pre-deployment, but this feature is production level, post deploy / release. I cannot stress how useful this is for quick fixes and post-deployment edits which would otherwise mean re-deploying or falling back to FTP. Project Kudu has been around for a while now, but it is great to see it appearing as a first-class citizen in the portal. The underlying functionality is provided by the Monaco Editor, which also powers Visual Studio Code.

It is worth noting that if Continuous Integration (CI) has been enabled for your application, the next build will overwrite your changes. But that is what you would expect: any changes made using the editor would need to be fed back into your code repository / build workflow.

Code Editor

To access the editor, simply type 'App Service' or go to App Service Editor (Preview) in the Development Tools section of the left-hand navigation of your app. (see below)

After selecting 'GO', a nice tree view of your site's assets is visible, along with the editor and options to view split pane and changes.

Search Features

There are a number of really good search features; below are the results of my query text 'sitemap'. As you can see, it not only found the set of sitemap files, it also discovered the robots.txt.

There is also a nice 'Go to File...' button which gives a context-based search dialog.

Code Repository Integration

Another great feature of the editor is the ability to hook up your repo from either a Git source or Visual Studio Online. Then you can benefit from all the goodness of having the repository commands right from within the editor.

Command Console

Backing up the GUI and driving the underlying command experience is the command console.

Typing 'help' in the console gives a list of available commands.

There is a lot that can be performed with that set of commands. Expect more commands to appear as other services are associated with your application.


Supporting the commands, and the output from your application, is the output window. It provides an ongoing stream of information, warnings and errors, based on the logging level set. You will also see custom log output from your application.


The benefits of this new feature are clear for all to see and can mean the difference between being able to react immediately to certain scenarios or not. Obviously with this power comes responsibility, and more than the usual care is required before committing changes with wide-reaching impact. But if used conservatively, under the right circumstances, and by those with the domain knowledge to understand the impact, it is a superb tool to have in your back pocket.

Available to discuss your next project.

Monday, 27 February 2017

Test connection to remote SQL Server Database from an Application server

While aiming to test whether a SQL connection is succeeding between an Application server and a remote Database Server, it is often not possible to install SQL Server Management Studio (SSMS) or Microsoft Command Line Utilities (MsSqlCmdLnUtils) due to the locked down nature of the targets, particularly in test and production environments.

A lightweight approach that worked for me recently makes use of components that have been part of Windows boxes for a long time, albeit with differing levels of database driver support as the components have evolved: the Microsoft Data Access Components (MDAC).

MDAC provides a Universal Data Link (UDL), which can be configured using a common user interface for specifying connection properties, as well as testing the connection.

 Data Link properties dialog box

Get started – Create a simple text file

Simply:

  • Create a .txt file anywhere on your system
  • Rename the extension to .udl
  • Double-click the Universal Data Link file (.udl)

We are then presented with the Data Link Properties dialog, which will look very familiar.

Using either Windows Integrated Security or a specific username / password combination, the connection to the database can be tested.

You can also save the configuration settings you have applied by just selecting 'OK' on the properties dialog, which is useful for recurring scenarios, more complex configurations, or just as a handy utility for future reference.
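Under the covers, a .udl file is just a small text file holding an OLE DB connection string, so the settings saved from the dialog can also be read or tweaked in Notepad. A saved file looks something like this (the provider shown is the default SQL Server OLE DB provider; the server and database names are placeholders):

```
[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=MyDatabase;Data Source=MyDbServer
```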

Using this approach, I was able to determine in just a few minutes that the password for a test environment database had been changed.


Feel free to contact me via my site AssemblySoft to discuss any ways I can help with your next project...

Monday, 13 February 2017

Debugging a Batch Script, managed by the Windows Task Scheduler (Part 1)

I was contacted recently with an interesting problem that had arisen for a client, where a batch script was not working as expected and silently failing in a production environment. It was being run by the Windows Task Scheduler, which was reporting a '1' return code in the Task:Action history. As the script was configured to run silently, there was no easy way to see what was going wrong. The Event Log did not contain anything useful, and unfortunately the batch script didn't contain any logging statements. There was also a strong desire not to install any remote tooling or alter the code, as it had been working previously and was currently in service.

I thought I would share the steps taken to identify and fix the issue, as it is not uncommon to be faced with a black box which seemingly doesn't offer much at first glance; with a little investigation, the lights can soon be turned back on.

Understanding the Stack

After some investigation it became clear that, apart from 'always getting given the fun stuff', the batch script was calling Python scripts via the Python runtime, which in turn called out to .Net components via IronPython.

The basic setup of the scheduler with the task running the scripts can be seen in the screenshot below.

Looking Forward

I have put this article together in two parts. Part 1 focuses on obtaining the errors from the batch script; the second part looks at extracting the errors from the Python and .Net layers.

The Solution

Something in the stack was failing but it was unclear what.

The first thing I proceeded to do was make a backup of the existing task by performing an export. This ensured we had all the properties and settings stored and provided a way back should it be required. I then used the exported properties to create a new task, essentially a copy that could be used to debug. Every care would now be required to ensure that any additions were on a read-only basis in terms of affecting flow and data. After disabling the failing task, it was time to get some output from the script by adding some Echo statements.
The Task Scheduler provides an 'Add Arguments' text box, where I attempted to redirect the output to a text file using > c:\logs\dailyTasks.log

This gave me an initial win where i could see my newly entered echo statements from the batch file but no actual errors. 

After some digging I added 2>&1 to the end of the argument list and voila, an error appeared in my log. Unfortunately the error was still rather cryptic and not very meaningful.

I also switched to the double greater-than (>>) redirection to 'append' rather than 'overwrite', which proved useful for seeing progress across runs.

So the arguments to the batch file now looked as follows:
>> c:\logs\dailyTasks.log 2>&1
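The effect of these arguments is easy to verify, since the >> and 2>&1 redirection syntax behaves the same way in cmd.exe as it does in a Unix shell. A quick illustration (the paths and commands here are just for demonstration):

```shell
# Simulate a scheduled task writing to a log: stdout and stderr both appended.
rm -f /tmp/dailyTasks.log
echo "task starting" >> /tmp/dailyTasks.log 2>&1
# A failing command writes its complaint to stderr; 2>&1 folds it into the log.
ls /no/such/path >> /tmp/dailyTasks.log 2>&1 || true
echo "task finished" >> /tmp/dailyTasks.log 2>&1
cat /tmp/dailyTasks.log
```

Without 2>&1 the error line would vanish, which is exactly the silent failure described above; without >> each run would overwrite the previous log.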

For those in a more traditional setup, this article will hopefully help a little with getting at the actual errors being reported from a batch script run and managed by the Windows Task Scheduler. This journey still has a way to go; for those interested in the Python and .Net integration, please read on in Part 2.


Feel free to contact me via my site AssemblySoft to discuss any ways I can help with your next project...

Friday, 3 February 2017

ASP.Net Core 1.1 DOS Vulnerability

January 2017 Update for ASP.NET Core 1.1

Yesterday, Microsoft released an update for ASP.NET Core 1.1 due to Microsoft Security Advisory 4010983. The advisory is for a vulnerability in ASP.NET Core MVC 1.1.0 that could allow denial of service. 

Affected Software

The vulnerability affects any Microsoft ASP.NET Core project if it uses the following affected package version.
  • Package name: Microsoft.AspNetCore.Mvc.Core
  • Package version: 1.1.0

Advisory FAQ

How do I know if I am affected?
ASP.NET Core has two different types of dependencies, direct and transitive. If your project has a direct or transitive dependency on Microsoft.AspNetCore.Mvc.Core version 1.1.0 you are affected.
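As a sketch of the remedy, assuming the csproj-based project format (project.json tooling takes the equivalent dependency entry), bumping any direct reference to the patched 1.1.1 package and restoring is all that is required:

```
<ItemGroup>
  <!-- 1.1.0 is the vulnerable version; 1.1.1 carries the January 2017 fix -->
  <PackageReference Include="Microsoft.AspNetCore.Mvc.Core" Version="1.1.1" />
</ItemGroup>
```

Transitive references are pulled up to the fixed version the same way: add the package as a direct dependency at the patched version.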
Full details of the advisory can be found here:
Further details on how to obtain the update and instructions for install can be found on the .Net Core Blog:
Although we are excited about cross-platform development with our favourite tooling and are embracing .Net Core, this keeps us mindful that we are still in the early stages of the journey, and should consider carefully whether now is the right time to embark on a full-blown production adoption for enterprise-wide solutions.

Tuesday, 23 September 2014

Windows Azure Storage Emulator failed to install


When attempting to install a new version of the Azure Storage Emulator either as a separate installation package or automatically as part of an Azure SDK update, you may run into an error message which states the storage emulator has failed to install. This can occur using the Web Platform Installer (WebPI), NuGet Package Manager or when performing the install manually.

Below is the message received using the WebPI.


Storage Emulator Background  (optional reading)

The Windows Azure Storage Emulator executable lives under the Microsoft SDKs directory, as shown below:


If we take a quick look inside the WAStorageEmulator.exe.config file we can see each of the storage services pointing to local service endpoints.

      <service name="Blob" url=""/>
      <service name="Queue" url=""/>
      <service name="Table" url=""/>


By default, the storage emulator uses SQL Server Express with the LocalDB execution mode to store its data for each of the storage services.

This configuration can be seen by looking at the DevelopmentStorage config file

<?xml version="1.0"?>
<DevelopmentStorage xmlns:xsd="" xmlns:xsi="" version="2009-03-18">

which can be found at the location shown below:

When the development storage emulator starts for the first time it creates the v11.0 database in LocalDB. This can be verified from the command line as shown below:
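For reference, the check amounts to listing the LocalDB instances with the SqlLocalDB utility (assuming it is on the path):

```
sqllocaldb info
:: v11.0 should appear in the list of instances reported
sqllocaldb info v11.0
```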

If you want to choose another instance of SQL instead of LocalDB, you would need to change the <SQLInstance>**</SQLInstance> property accordingly as shown in the configuration snippet above.

The database files for the storage emulator are located in the logged-in user's root directory.

The format of the database file name is WAStorageEmulatorDb<Version of storage SDK>.mdf
So the version of the Azure Storage Emulator currently installed is 3.3, as shown above.

There will be a database per version of the SDK installed in this directory.

SQL Server Express 'LocalDB' Background (optional reading) 

Microsoft SQL Server Express allows you to take advantage of the same powerful database engine in a version tailored for redistribution and embedding. SQL Server Express includes 10GB of storage per database, easy backup and restore functionality, and compatibility with all editions of SQL Server and Microsoft Azure SQL Database.

LocalDB is a special execution mode of SQL Server Express which runs under the user's security context, targeted at developers.

SQL Server Express LocalDB instances are managed by using the SqlLocalDB.exe utility. LocalDB can be used to work with SQL Server databases. System database files for a database are stored in the user's local AppData folder.


Inspecting the install logs gives us an indication, although a subtle one, as to what is going wrong with the install (cut down for brevity):

  1. === Logging started: 06-Apr-14  0:02:16 ===
  2. Action start 0:02:16: INSTALL.
  3. Action start 0:02:16: AppSearch.
  4. Action ended 0:02:16: AppSearch. Return value 1.
  5. ...
  6. Action start 0:02:16: InstallFinalize.
  7. CAQuietExec:  Windows Azure Storage Emulator command line tool
  8. CAQuietExec:  Error: No available SQL Instance was found.
  9. CAQuietExec:  Error 0xfffffff6: Command line returned an error.
  10. CAQuietExec:  Error 0xfffffff6: CAQuietExec Failed
  11. CustomAction RunInitialize returned actual error code 1603 (note this may not be 100% accurate if translation happened inside sandbox)
  12. Action ended 0:02:16: InstallFinalize. Return value 3.
  13. Action ended 0:02:17: INSTALL. Return value 3.
  14. Property(S): UpgradeCode = {CF5CD495-AEDE-42DA-B7CF-A70D398D4E6A}
  15. Property(S): RunInitialize = "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator\WAStorageEmulator.exe" init -forcecreate -autodetect
  16. Property(S): DOTNET4FULL = 4.5.51641
  17. Property(S): LOCALDBINSTALLED = C:\Program Files (x86)\Microsoft SQL Server\110\LocalDB\Binn\SqlUserInstance.dll
  18. Property(S): SQLEXPRESSVERSION = 11.0.2100.60
  19. Property(S): TARGETDIR = J:\
  20. Property(S): StorageEmulatorMenuFolder = C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Windows Azure\Storage Emulator\
  21. Property(S): STORAGEEMUDIR = C:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator\
  22. Property(S): WixUIRMOption = UseRM
  23. Property(S): ALLUSERS = 1
  24. Property(S): ARPNOMODIFY = 1
  25. Property(S): REINSTALLMODE = amus
  26. Property(S): WindowsAzureMenuFolder = C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Windows Azure\
  27. Property(S): ProgramMenuFolder = C:\ProgramData\Microsoft\Windows\Start Menu\Programs\
  28. Property(S): WINDOWSAZUREDIR = C:\Program Files (x86)\Microsoft SDKs\Windows Azure\
  29. Property(S): MICROSOFTSDKSDIR = C:\Program Files (x86)\Microsoft SDKs\
  30. Property(S): ProgramFilesFolder = C:\Program Files (x86)\
  31. ...
  32. Property(S): ProductToBeRegistered = 1
  33. MSI (s) (6C:2C) [00:02:17:017]: Product: Windows Azure Storage Emulator - v3.0 -- Installation failed.
  34. MSI (s) (6C:2C) [00:02:17:017]: Windows Installer installed the product. Product Name: Windows Azure Storage Emulator - v3.0. Product Version: 3.0.6848.39. Product Language: 1033. Manufacturer: Microsoft Corporation. Installation success or error status: 1603.
  35. === Logging stopped: 06-Apr-14  0:02:17 === 

The error message at first may appear confusing as you most likely do have an available SQL instance. However digging a little deeper into what the installer is attempting to do will help resolve the issue.

Looking back at the error log may have left you confused as you may well have verified that a LocalDB instance does indeed exist.

Taking a look at the 'application' event log however reveals more detail.

Log Name:      Application
Source:        SQLLocalDB 11.0
Event ID:      267
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
LocalDB instance is corrupted. See the Windows Application event log for error details.
Event Xml:
<Event xmlns="">
    <Provider Name="SQLLocalDB 11.0" />
    <EventID Qualifiers="35269">267</EventID>
    <TimeCreated SystemTime="2014-05-05T16:44:32.000000000Z" />
    <Security />

Log Name:      Application
Source:        SQLLocalDB 11.0
Event ID:      261
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Cannot access LocalDB instance folder: %%LOCALAPPDATA%%\Microsoft\Microsoft SQL Server Local DB\Instances\<instance name>.
Event Xml:
<Event xmlns="">
    <Provider Name="SQLLocalDB 11.0" />
    <EventID Qualifiers="35269">261</EventID>
    <TimeCreated SystemTime="2014-05-05T16:44:32.000000000Z" />
    <Security />

Here we see a more granular level of explanation, informing us that the LocalDB instance is corrupt.

When we upgrade versions of the Storage Emulator, it is most likely that the database has undergone some enhancements; as a result, we need to ensure the database gets updated regardless of its prior existence.

The Storage Emulator database we mentioned earlier only gets created on first run of the storage emulator, if it doesn't already exist. If we look more closely at the command-line tool switches available for the 'Init' command, we observe the -forcecreate option. This forces the creation of the SQL database, even if it already exists.

As our error is at the instance level, the storage emulator command-line tool's -forcecreate option won't buy us anything. However, if our error message had been related to the actual database, it would certainly come into play.

In this case we can go directly to the database instance: stop it, delete it and re-create it, including the removal of any storage emulator databases that may have been partially created along the way.

Firstly, stop the LocalDB instance from the command line by issuing the stop command: sqllocaldb stop v11.0

Then proceed to issue the delete command: sqllocaldb delete v11.0

It is possible that the WAStorageEmulatorDb* database has been created by a previous attempt at the current upgrade you are trying to undertake. In that case, at this point, issue a command to delete the storage emulator database from the location shown earlier:
delete C:\Users\<Your user account>\WAStorageEmulatorDb*.*

Proceed to re-create the LocalDB instance by issuing the following command: sqllocaldb create v11.0
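Collected together, the recovery sequence from an elevated command prompt looks like this (the database file location follows the pattern described earlier; substitute your own user account):

```
sqllocaldb stop v11.0
sqllocaldb delete v11.0
del %USERPROFILE%\WAStorageEmulatorDb*.*
sqllocaldb create v11.0
```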
After performing the steps outlined, we can re-run the upgrade and get the result we expected first time round.

Quick Reference

  • WAStorageEmulator [/start] [/stop] [/status] [/clear] [/init] [/help]


When experiencing issues upgrading the storage emulator, the install logs, the event log and some understanding of what the storage emulator relies on to run will all help in solving installation problems. In our case it boiled down to a corrupt LocalDB instance which needed a nudge to re-create.

We have come to expect our upgrade experiences to work almost flawlessly these days, but it is always worth bearing in mind that an install is just a set of code steps written by someone like you or me, using libraries with potential bugs just waiting to crop up, even with Test Driven Development (TDD), Behavior Driven Development (BDD), integration testing and smarter software release cycles.