Continuous Integration with Azure from TFS Build

Recently I have been doing some TFS work.  One of the tasks I completed was implementing automatic publishing to Azure from a TFS build.  Our goal is to automate as much as possible, both to make testing easy for QA and to ensure that every night there is a fresh build ready for them.  There are a few side benefits of this approach that I really like.  It is very much in line with agile methodology, ensuring that we have a shippable version of the software as often as possible.  It is also nice to test the final product in the eventual production environment as often and as early as possible.  With that, let’s dive into the solution.

Our software solution is a typical web application: ASP.NET MVC, a DAL based on the Enterprise Library Data Access Application Block, stored procedures, a business objects layer, and a services layer.  We use a Visual Studio Database project to manage database changes and deploy to QA.  We use VS 2010 and TFS 2008.

I had already written a TFSBuild.proj file that I use to publish to QA.  This build does the following tasks:

  • Builds the solution
  • Migrates the database using the Visual Studio Database project
  • Deploys the solution to the QA web site on premises

Our unit and integration tests are run by the continuous integration build, so we do not run unit tests as part of the QA build.  I decided to use this QA build as my base.

Everything I read about publishing to Azure from a build ignored one important issue – what to do with the database.  Unfortunately, the database project does not support publishing to Azure.  That is a sad fact given the importance Microsoft puts on Azure.

I have been using Red Gate products for many years, almost 10 to be specific.  Red Gate SQL Compare to the rescue.  The newest version, 10, has Azure support.  More specifically, it allows one to compare a local SQL Server 2008 R2 database to an Azure SQL database and synchronize them. Cool.  So, here is the outline of my TFSBuild.proj file – more specifically, the tasks it performs at a high level.

  • Build the Azure-based solution.  I created this solution by copying the existing solution, then adding an Azure project to it.
  • Deploy the database using the database project to a SQL Server.
  • Use SQL Compare to synchronize this database with a database on Azure that was created at some point.  It can be just a blank database, as SQL Compare will create the structure for us.
  • Create a post-deployment script by combining all post-deployment scripts from the database project, using Script.PostDeployment.sql and its data as the guideline.  Run this script against the Azure database using a utility.
  • Use the Azure Cmdlets package from CodePlex to deploy the package to Azure.

Now, let’s dive into the details.

To deploy the database project from TFS Build, you just need the following task in your proj file.

    <PropertyGroup>
        <TargetConnString>Data&#x20;Source=MyServer%3BUser&#x20;ID=myUserID%3BPassword=myPassword</TargetConnString>
    </PropertyGroup>

    <Target Name="AfterDropBuild">
        <Message Text="Starting to deploy database project"/>
        <MSBuild Projects="$(SolutionRoot)\MySolution\MyDatabaseProject\MyDatabase.dbproj"
            Properties="TargetDatabase=AZURE_TEST;Configuration=Release;OutDir=$(DropLocation)\$(BuildNumber)\Release;TargetConnectionString=$(TargetConnString);DeployToDatabase=true"
            Targets="Deploy"/>
        <Message Text="Finished deploying database project"/>
    </Target>

A few key things to notice.  You have to encode the connection string, or you will get some errors.  You also have to add the property DeployToDatabase=true, or it will not deploy – the build runs successfully, it just does not apply the updates.
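For reference, here is what the encoding buys you (my reading of the MSBuild escaping rules, using a generic connection string): MSBuild treats an unescaped semicolon in a property value as a list separator, so semicolons become %3B, and in this setup spaces are written as the XML entity &amp;#x20; as well.

```xml
<!-- Unescaped value (MSBuild would split it at each semicolon):
       Data Source=MyServer;User ID=myUserID;Password=myPassword
     Escaped value that survives intact: -->
<TargetConnString>Data&#x20;Source=MyServer%3BUser&#x20;ID=myUserID%3BPassword=myPassword</TargetConnString>
```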

Now off to Azure synchronization.  I wrote a little command-line utility that I call to do two tasks: synchronize with the Azure SQL database and run the scripts against it.  The code for this utility is very simple – it launches SQLCompare.exe, which comes with the Professional version of SQL Compare, then does some string parsing to create the post-deployment script and runs it.  Here is what the method that synchronizes the database looks like:

        public static bool UpdateAzureDB(string logFileName)
        {
            var process = new Process();
            if (File.Exists(logFileName))
            {
                File.Delete(logFileName);
            }
            var commandLine =
                @" /db1:AZURE_TEST /p1:localPassword /u1:localUserID /s1:localServer /db2:AZURE_DB /p2:azurePassword " + 
                @"/u2:azureUserID /s2:********.database.windows.net /exclude:user /exclude:role " + 
                @"/r:" + logFileName + @" /rt:Simple /sync  /include:identical";
            var startInfo = new ProcessStartInfo(
                @"C:\Program Files (x86)\Red Gate\SQL Compare 10\SQLCompare.exe", commandLine);
            process.StartInfo = startInfo;
            process.Start();
            process.WaitForExit();
            return process.ExitCode == 0;
        }

A couple of important notes.  We have to know whether the process succeeds, so we trap the return value; SQLCompare returns a non-zero value if something goes wrong.  We are also creating a log file in HTML format, just in case.  We are excluding database users and roles, since I do not need to synchronize those.

The next step is data massaging.  We follow database project best practices: we break the data massaging down into multiple scripts, then combine them in Script.PostDeployment.sql.  Here is an example of that script:

/*
Post-Deployment Script Template                            
--------------------------------------------------------------------------------------
 This file contains SQL statements that will be appended to the build script.        
 Use SQLCMD syntax to include a file in the post-deployment script.            
 Example:      :r .\myfile.sql                                
--------------------------------------------------------------------------------------
*/
:r .\Table1.Data.SQL
:r .\Table2.Data.SQL

My utility accepts the directory name where this file is located, parses the file, and creates one giant string with the content of all the sub-scripts.  Once that is done, it uses the passed-in connection string and fires off the final script.  Again, the code is pretty simple.

        public static void UpdateData(string folderName, string azureConnectionString)
        {
            var scriptData =
                File.ReadAllLines(string.Format(@"{0}\{1}", folderName, "Script.PostDeployment.sql"));
            var finalScript = 
                (from line in scriptData 
                 where line.StartsWith(":r") 
                 select line.Substring(line.IndexOf(@"\") + 1))
                    .Aggregate("", (current, fileName) => current + File.ReadAllText(string.Format(@"{0}\{1}", folderName, fileName)));

            using (var connection = new SqlConnection(azureConnectionString))
            {
                connection.Open();
                using (var command = connection.CreateCommand())
                {
                    command.CommandType = CommandType.Text;
                    command.CommandText = finalScript;
                    command.ExecuteNonQuery();
                }
            }
        }
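One caveat worth noting (my own observation, not something covered above): ADO.NET will throw if the combined script contains GO batch separators, because GO is a SQLCMD/SSMS construct rather than T-SQL. If your sub-scripts use GO, a small splitter like this hypothetical one can break the script into batches first:

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

public static class SqlBatchSplitter
{
    // Splits a script on lines that contain only "GO" (any casing),
    // which a single ExecuteNonQuery call cannot process.
    public static List<string> Split(string script)
    {
        var batches = new List<string>();
        foreach (var batch in Regex.Split(script, @"^\s*GO\s*$",
                     RegexOptions.Multiline | RegexOptions.IgnoreCase))
        {
            if (batch.Trim().Length > 0)
            {
                batches.Add(batch.Trim());
            }
        }
        return batches;
    }
}
```

Each batch can then be executed through its own ExecuteNonQuery call on the already-open connection.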

Now, I just have a wrapper utility that calls both methods in succession, traps errors, and returns zero only if both methods execute properly.  I need this so that I can stop the build in case of a non-zero exit code from my utility. To do so, I use Environment.Exit(0), or a non-zero value in case of errors.  I also use Console.WriteLine, because this output ends up in the build log and helps me troubleshoot issues.  Now, I just need to call it from TFSBuild.proj:
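The wrapper itself is not shown above; a minimal sketch of the idea follows. The names BuildStepRunner and RunSteps are mine, and the commented-out Main is hypothetical – only the exit-code convention (zero on success, non-zero on any failure) comes from the text.

```csharp
using System;

public static class BuildStepRunner
{
    // Runs each step in order; returns 0 only if every step succeeds,
    // -1 on a false result or an exception. The TFS build watches this
    // exit code and stops when it is non-zero.
    public static int RunSteps(params Func<bool>[] steps)
    {
        try
        {
            foreach (var step in steps)
            {
                if (!step())
                {
                    return -1;
                }
            }
            return 0;
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex); // surfaces in the build log
            return -1;
        }
    }
}

// Hypothetical Main wiring up the two methods from this post:
// public static void Main(string[] args)
// {
//     Environment.Exit(BuildStepRunner.RunSteps(
//         () => UpdateAzureDB(args[2]),
//         () => { UpdateData(args[0], args[1]); return true; }));
// }
```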

    <PropertyGroup>
        <DatabaseUpdaterExecutable>UpdaterApp.exe</DatabaseUpdaterExecutable>
        <AzureConnectionString>"Data&#x20;Source=xxxxxxxx.database.windows.net%3BInitial&#x20;Catalog=AZURE_DB%3BPersist&#x20;Security&#x20;Info=True%3BUser&#x20;ID=remoteUserID%3BPassword=remotePassword"</AzureConnectionString>
        <AzureDeploymentReportFileLocation>C:\Temp\AzureDBReport.html</AzureDeploymentReportFileLocation>
    </PropertyGroup>

    <Target Name="AfterDropBuild">
        <Message Text="Working on Azure DB"/>
        <Message Text="Executable: $(SolutionRoot)\Deployment\$(DatabaseUpdaterExecutable)"/>
        <Message Text="PostDeploymentScriptLocation: $(SolutionRoot)\MySolution\MyDatabaseProject\Scripts\Post-Deployment"/>
        <Message Text="AzureConnectionString: $(AzureConnectionString)"/>
        <Message Text="AzureDeploymentReportFileLocation: $(AzureDeploymentReportFileLocation)"/>
        <Exec Command="&quot;$(SolutionRoot)\Deployment\$(DatabaseUpdaterExecutable)&quot; &quot;$(SolutionRoot)\MySolution\MyDatabaseProject\Scripts\Post-Deployment&quot; $(AzureConnectionString) $(AzureDeploymentReportFileLocation)" ContinueOnError="false"/>
        <Message Text="Finished working on Azure"/>
    </Target>

Now, deploying the solution to Azure.  I ran into an issue installing the Cmdlets – I was getting a “No snapins are installed” error when running PowerShell.exe.  What I ended up doing is installing both the 32-bit and 64-bit versions of the Cmdlets.  You can do that by opening installPSCmdlets.cmd, which will be in your scripts folder after you install the Cmdlets from CodePlex, and running it down both the 32-bit and 64-bit code paths: just manually set the powerShellDir variable to each value and run the script twice.  Then do the same with installPSSnapIn.cmd.  Of course, make sure you do all of this on your build server.
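As a sketch of the double install (the folder path is my assumption – check where the CodePlex package unpacked on your server), run something like this from an elevated prompt:

```cmd
rem Edit installPSCmdlets.cmd and set powerShellDir to the 64-bit host, then:
cd /d C:\AzureCmdlets\setup\scripts
installPSCmdlets.cmd

rem Re-edit powerShellDir to point at the 32-bit host, e.g.
rem   %windir%\SysWOW64\WindowsPowerShell\v1.0
rem and run the same script again:
installPSCmdlets.cmd

rem Repeat the same two passes for installPSSnapIn.cmd
installPSSnapIn.cmd
```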

Now, let’s update the TFSBuild.proj file.  Here is the relevant part:

    <PropertyGroup>
        <PackageName>MySolutionAzure.cspkg</PackageName>
        <ServiceConfigName>ServiceConfiguration.Cloud.cscfg</ServiceConfigName>
        <HostedServiceName>myServiceName</HostedServiceName>
        <Thumbprint>XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX</Thumbprint>
        <SubscriptionID>XXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXX</SubscriptionID>
        <Slot>Production</Slot>
        <StorageName>myStorageServiceName</StorageName>
    </PropertyGroup>

    <Target Name="AfterDropBuild">

        <Message Text="Copying Azure configuration file"/>
        <Copy SourceFiles="$(SolutionRoot)\Configuration\Azure\Web.config"
            DestinationFolder="$(SolutionRoot)\MySolution\MyWebProject" OverwriteReadOnlyFiles="true"/>
        
        <Message Text="Starting to build azure project for deployment"/>
        <MSBuild Projects="$(SolutionRoot)\MySolution\MyAzureProject\MyAzureProject.ccproj" Properties="Configuration=Release;OutDir=$(DropLocation)\$(BuildNumber)\Release\app.publish;PackageForComputeEmulator=true;" Targets="Rebuild;Publish"/>
        <Message Text="Finished building azure project for deployment"/>

        <Copy SourceFiles="$(SolutionRoot)\MySolution\MyAzureProject\bin\Release\app.publish\MyAzureProject.cspkg"
            DestinationFolder="C:\temp" OverwriteReadOnlyFiles="true"/>

        <Copy SourceFiles="$(SolutionRoot)\MySolution\MyAzureProject\bin\Release\app.publish\ServiceConfiguration.Cloud.cscfg"
            DestinationFolder="C:\temp" OverwriteReadOnlyFiles="true"/>
        

        
        <Message Text="Starting to push package to Azure.  Package location follows"/>
        <Message Text="$(DropLocation)\$(BuildNumber)\Release"/>

        <Exec WorkingDirectory="$(SolutionRoot)\Deployment" ContinueOnError="false" 
              Command="$(windir)\system32\WindowsPowerShell\v1.0\powershell.exe -f AzureDeploy.ps1 C:\temp $(PackageName) $(ServiceConfigName) $(HostedServiceName) $(Thumbprint) $(SubscriptionID) $(Slot) $(StorageName)"/>
        
        <Message Text="Done pushing package to Azure"/>
    </Target>

There are a couple of changes I made to the ps1 file, though.  You can find that file in a ton of places on the internet, and all the versions share the same issues:

  • No error handling
  • No check to see if a deployment already exists
  • No non-zero exit code returned to signal to the build that deployment failed

You have to address these issues for a clean deployment.

Here is my version:

$buildPath = $args[0]
$packagename = $args[1]
$serviceconfig = $args[2]
$servicename = $args[3]
$thumbprint = $args[4]

$cert = Get-Item cert:\CurrentUser\My\$thumbprint
$sub = $args[5]
$slot = $args[6]
$storage = $args[7]
$package = join-path $buildPath $packageName
$config = join-path $buildPath $serviceconfig
$a = Get-Date
$buildLabel = $a.ToShortDateString() + "-" + $a.ToShortTimeString()

write-host  "parameters"
write-host  $args[0]
write-host  $args[1]
write-host  $args[2]
write-host  $args[3]
write-host  $args[4]
write-host  $args[5]
write-host  $args[6]
write-host  $args[7]
write-host  "package data"
write-host  $package
write-host  $config
write-host  $servicename
  
#Important!  When using file based packages (non-http paths), the 
#cmdlets will attempt to upload the package to blob storage for 
#you automatically.  If you do not specify a 
#-StorageServiceName option, it will attempt to use a storage 
#account with the same name as $servicename.  If that 
#account does not exist, it will fail.  This only applies to 
#file-based package paths.
$exitCode=0
Try
{
    $ErrorActionPreference = "stop"

    Add-PSSnapin WAPPSCmdlets
    Try
    {
        $hostedService = Get-HostedService $servicename -Certificate $cert -SubscriptionId $sub | Get-Deployment -Slot $slot

        if ($hostedService.Status -ne $null)
        {
            $hostedService |
              Set-DeploymentStatus 'Suspended' |
              Get-OperationStatus -WaitToComplete
            $hostedService | 
              Remove-Deployment | 
              Get-OperationStatus -WaitToComplete
        }
    }
    Catch
    {
        # eat errors as we may not have an active deployment
    }

    Get-HostedService -ServiceName $servicename -Certificate $cert -SubscriptionId $sub | New-Deployment -Slot $slot -Package $package -Configuration $config -Label $buildLabel -ServiceName $servicename -StorageServiceName $storage | 
        Get-OperationStatus -WaitToComplete

    Get-HostedService -ServiceName $servicename -Certificate $cert -SubscriptionId $sub | 
        Get-Deployment -Slot $slot | 
        Set-DeploymentStatus 'Running' | 
        Get-OperationStatus -WaitToComplete
}    
Catch
{
    $exitCode=-1
    write-host $error
}
exit $exitCode

I also added a few more things.  I echo some important information to the build log, which makes it much easier to debug problems when they occur.

To do the initial push, I recommend you use the Publish feature from Visual Studio, as it will handle the myriad of certificates you need to get right to publish from PowerShell.

Make sure you pick the correct certificate for deployment.  If you are looking at your Windows Azure management console, you need the one listed under the Management tab.  Look at the thumbprint, then find the certificate on the machine from which you did the initial push. I recommend using the Microsoft Management Console in Windows.  Type “mmc” in Start->Run, then add the Certificates snap-in; you will find the certificate under the very first node.  Export it to a PFX file using mmc, then import it on the build machine after you log in with the same account your build runs under.  While on the subject, you should install ALL required software under that account, just in case.  This includes the Azure SDK, which you will need, and the other dependencies of the Cmdlets, etc.
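If you prefer the command line over mmc for the export/import, certutil can do it as well; this is a sketch, with the PFX password and thumbprint as placeholders you supply:

```cmd
rem On the machine that did the initial push, export from the user's My store:
certutil -user -p "pfxPassword" -exportPFX My <thumbprint> deploy.pfx

rem On the build machine, logged in as the build account:
certutil -user -p "pfxPassword" -importPFX deploy.pfx
```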

Now, just schedule your build, and you are ready to go.
