<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Geoff - Geoff's Blog]]></title><description><![CDATA[.Net, Microsoft Azure, XAML and WP developer. Software problem solver. [Formerly http://geoffwebbercross.blogspot.co.uk]]]></description><link>http://webbercross.azurewebsites.net/</link><generator>Ghost 0.5</generator><lastBuildDate>Wed, 15 Apr 2026 00:31:53 GMT</lastBuildDate><atom:link href="http://webbercross.azurewebsites.net/author/geoff/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Copy a (db) file from an installed Android apk]]></title><description><![CDATA[<p>I've been looking at doing some performance tuning on a troublesome SQL query on a SQLite db on an Android app and wanted to copy the actual db to do some testing in an IDE.</p>

<p>It took a long time reading various SO articles to find something that worked, but I eventually found <a href='https://stackoverflow.com/questions/18471780/android-adb-retrieve-database-using-run-as' >this answer</a> from Triplee.</p>

<p>Basically you need to run adb shell, <code>run-as</code> the package to impersonate its permissions, then use <code>cat</code> to pipe the file to another location on the device (the sdcard, which is available on the emulator), and finally use <code>adb pull</code> to copy it to the host machine. The steps are:</p>

<ol>
<li>Open adb command prompt (you can do this through Visual Studio)  </li>
<li><code>adb shell</code>  </li>
<li><code>run-as com.package.name</code>  </li>
<li><code>cat foldername/dbfilename.db &gt; /sdcard/dbfilename.db</code>  </li>
<li><code>exit</code>  </li>
<li><code>exit</code>  </li>
<li><code>adb pull /sdcard/dbfilename.db "c:\users\youruser\desktop\dbfilename.db"</code></li>
</ol>
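The steps above can also be scripted. Here is a minimal Python sketch of the same sequence; the package and file names are the placeholders from the steps, and the `run` callable is injectable so the adb calls can be stubbed in a test:

```python
import subprocess

def pull_db(package: str, db_path: str, dest: str, run=subprocess.run) -> None:
    """Pull a file out of a debuggable app via run-as + cat, then adb pull.

    package, db_path and dest are placeholders for your own app's details.
    """
    name = db_path.rsplit("/", 1)[-1]
    # run-as impersonates the app's permissions; cat pipes the file to the sdcard
    run(["adb", "shell", f"run-as {package} cat {db_path} > /sdcard/{name}"], check=True)
    # copy from the device's sdcard to the host machine
    run(["adb", "pull", f"/sdcard/{name}", dest], check=True)
```

This only works for debuggable builds, since <code>run-as</code> refuses to impersonate release-signed packages.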

<p>Works nicely!</p>]]></description><link>http://webbercross.azurewebsites.net/copy-a-db-file-from-an-installed-android-apk/</link><guid isPermaLink="false">bc0accfe-46d3-41bf-a885-b0abdad178c7</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Wed, 12 Sep 2018 10:30:21 GMT</pubDate></item><item><title><![CDATA[Connecting to a TFS Git Repository]]></title><description><![CDATA[<p>Yesterday I started working with a client who uses TFS with Git. My outlook.com account was added to the repo and Visual Studio 2017 connected fine once I was logged in with this account. <br />
I then wanted to pull the repo in Visual Studio for Mac and discovered it's impossible using the Microsoft account outlook.com credentials - I just got unauthorized messages.</p>

<p>I tried it in Git Bash too and had the same result. I think when Visual Studio connects on Windows it must stash a token which it then uses to authenticate itself, never using the Microsoft account username and password.</p>

<p>Normally when I connect to GitHub or BitBucket using HTTPS, I only use a simple username and password combination, so I figured it must be possible, but not with the Microsoft email address. After a fair amount of googling, I discovered that there is the concept of <strong>Alternate authentication credentials</strong>.</p>

<p>If you log in to .visualstudio.com and click <strong>Manage security</strong> next to the repository:</p>

<p><img src='http://webbercross.azurewebsites.net/content/images/2018/Feb/git1.png'  alt="Manage security" /></p>

<p>You get the security menu where you can select Alternate authentication credentials:</p>

<p><img src='http://webbercross.azurewebsites.net/content/images/2018/Feb/git2-1.png'  alt="Manage security" /></p>

<p>In here, you can create your <strong>Alternate authentication credentials</strong>:</p>

<p><img src='http://webbercross.azurewebsites.net/content/images/2018/Feb/git3.png'  alt="Alternate auth" /></p>

<p>Once saved, the .visualstudio.com Git repository can be cloned either from Git Bash or Visual Studio for Mac using these credentials.</p>]]></description><link>http://webbercross.azurewebsites.net/connecting-git-to-a-tfs-git-repository/</link><guid isPermaLink="false">76ec44b0-65b0-41af-ac8e-a3c45f0ad4fc</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Sat, 17 Feb 2018 13:39:39 GMT</pubDate></item><item><title><![CDATA[Xamarin Resx Localisation Weirdness]]></title><description><![CDATA[<p>This was a bit of a head scratcher. I added localisation with 20+ languages to a project using the following kind of file-naming notation:</p>

<pre><code>AppResource.cs-CZ.resx
AppResource.es-ES.resx
...etc
</code></pre>

<p>The resource files were set with the following build properties:</p>

<pre><code>Build Action: Embedded Resource
Copy to Output Directory: Do not copy
Custom Tool: PublicResXFileCodeGenerator
</code></pre>

<p>In debug mode, the resources loaded fine and my localised text was appearing in the UI. However, when built in Release mode, the resources were not loading and the default .resx (English) was being used.</p>

<p>After lots of messing around with various settings, I remembered that a previous build had worked with a test Spanish file:</p>

<pre><code>AppResource.es.resx
</code></pre>

<p>Now, this file didn't have a regional variation in its name, but it worked.</p>
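A hedged Python sketch of stripping the region part from such file names (the resource file names are illustrative; note that culture pairs like zh-Hans and zh-Hant would collapse to the same name):

```python
from pathlib import Path

def strip_region_suffix(folder: str) -> None:
    """Rename AppResource.es-ES.resx -> AppResource.es.resx and so on.

    Beware: regional variants of the same language (e.g. zh-Hans/zh-Hant)
    collapse to one file, so run this only where that is acceptable.
    """
    for f in Path(folder).glob("AppResource.*-*.resx"):
        lang = f.name.split(".")[1].split("-")[0]  # "es-ES" -> "es"
        f.rename(f.with_name(f"AppResource.{lang}.resx"))
```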

<p><strong>Removing the region variations from the file names worked for Release mode!</strong> Unfortunately I wanted simplified and traditional Chinese, so had to settle for just simplified.</p>]]></description><link>http://webbercross.azurewebsites.net/xamarin-resx-localisation-weirdness/</link><guid isPermaLink="false">59d3e318-c80c-4fdd-93b7-d56fa77c7d68</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Sat, 16 Dec 2017 12:14:32 GMT</pubDate></item><item><title><![CDATA[Xamarin MasterDetailPage IsGestureEnabled Android Bug Hack]]></title><description><![CDATA[<p>I've implemented a <strong>MasterDetailPage</strong> in an app which is used on one page for filtering a list. It's advised that it's placed at root level which means it's available for use on all pages.</p>

<p>On most pages the flyout master page isn't needed or wanted, but it can still be opened by swiping from the left margin.</p>

<p>There is a property on the control called <strong>IsGestureEnabled</strong> which should control whether the swipe gesture works or not, so I decided to apply it once and only have a button on the page control the <strong>IsPresented</strong> property to show and hide the master page.</p>

<p>This works fine on iOS, but not on Android. Initially the gesture is disabled, but as soon as the property changes, the gestures become active and start allowing swipe-from-left to open once the drawer has been closed. I've not looked at the DrawerLayout, but I'm assuming the gestures are enabled to allow the user to swipe closed; however, the state is not reverted afterwards.</p>

<p>I had a look at the <a href='https://github.com/xamarin/Xamarin.Forms/blob/master/Xamarin.Forms.Platform.Android/Renderers/MasterDetailRenderer.cs' >Renderer</a> code and noticed that the <strong>IsGestureEnabled</strong> property handling in the <em>HandlePropertyChanged</em> method calls a method called <em>SetGestureState</em> which sets the drawer lock mode:</p>

<pre><code>void SetGestureState()
{
    SetDrawerLockMode(_page.IsGestureEnabled ? LockModeUnlocked : LockModeLockedClosed);
}
</code></pre>

<p>So at first I was thinking I'd create a custom renderer to try and permanently disable the gestures, but then I realised that if we just toggle the <strong>IsGestureEnabled</strong> property, the property change fires, the method gets called and the drawer is locked closed.</p>

<p>I hooked the toggling onto the <strong>IsPresentedChanged</strong> event in the code-behind of my <em>MasterDetailPage</em> instance like this:</p>

<pre><code>// Hack to re-disable drawer after close. Calls SetDrawerLockMode on Android
if(Device.RuntimePlatform == Device.Android)
{
    IsPresentedChanged += (s, e) =&gt;
    {
        if (IsPresented) return;

        IsGestureEnabled = true;
        IsGestureEnabled = false;
    };
}
</code></pre>

<p>...and it works great!</p>]]></description><link>http://webbercross.azurewebsites.net/xamarin-masterdetailpage-isgestureenabled-android-bug-hack/</link><guid isPermaLink="false">3cc8f1dc-04dd-4b88-a1f1-0ee949a8f4e2</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Thu, 14 Dec 2017 23:39:50 GMT</pubDate></item><item><title><![CDATA[New Android SDKs not showing up in VS 2017]]></title><description><![CDATA[<p>I just downloaded Android 26 (8.0) and 27 (8.1) SDKs with the SDK manager to align my system with another dev working on the same project and found the SDKs don't show up in the Android project properties target platform.</p>

<p>I checked to make sure there were no updates in the Visual Studio <strong>Tools/Extensions and Updates</strong> manager and there were none, so I was still puzzled.</p>

<p>I thought maybe VS wasn't showing what was in the platforms folder:</p>

<pre><code>C:\Program Files (x86)\Android\android-sdk\platforms
</code></pre>

<p>So I went and renamed some of my older SDKs and sure enough, the renamed folders disappeared, so it was reading the SDK versions OK.</p>
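That experiment amounts to listing what lives under the platforms folder; a small Python sketch of the same check (the SDK path is the one above, so adjust for your machine):

```python
from pathlib import Path

def installed_platforms(sdk_root: str) -> list[str]:
    """List the android-NN platform folders the IDE reads its SDK versions from."""
    platforms = Path(sdk_root) / "platforms"
    return sorted(p.name for p in platforms.iterdir() if p.is_dir())
```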

<p>So maybe there was a VS update I wasn't getting? After a bit of googling I found that you can manually check for an update via the VS installer:</p>

<ul>
<li>Locate the installer under <strong>Apps &amp; Features</strong></li>
<li>Click <strong>Modify</strong></li>
<li>If there is an update available the <strong>Update</strong> button will be available</li>
</ul>

<p>Once updated, the new SDKs appeared :-)</p>]]></description><link>http://webbercross.azurewebsites.net/new-android-sdks-not-showing-up-in-vs/</link><guid isPermaLink="false">7e1b9ce7-99f7-4994-9379-2b63648b182d</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Tue, 05 Dec 2017 10:29:32 GMT</pubDate></item><item><title><![CDATA[Xamarin Droid Resource Error]]></title><description><![CDATA[<p>I innocently added an image to Resources/Drawable. The problem was, I had a dash in the file name (filter-applied.png). The resource packaging didn't like this, resulting in the following build error:</p>

<pre><code>    The file "obj\Debug\android\bin\packaged_resources" does not exist.
</code></pre>
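Since the dash is the culprit, a hedged Python sketch of renaming offending drawables in bulk (the folder and file names are illustrative):

```python
from pathlib import Path

def sanitize_drawables(folder: str) -> None:
    """Android resource names only allow lowercase letters, digits, '_' and '.';
    a dash such as filter-applied.png breaks resource packaging."""
    for f in Path(folder).glob("*-*.png"):
        f.rename(f.with_name(f.name.replace("-", "_")))
```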

<p>Removing the dash, or replacing it with an underscore, fixed the problem.</p>]]></description><link>http://webbercross.azurewebsites.net/xamarin-droid-resource-error/</link><guid isPermaLink="false">c4b52431-db0a-4c16-9060-c2829a3b39aa</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Wed, 29 Nov 2017 17:48:16 GMT</pubDate></item><item><title><![CDATA[EF7 SQLite Click-Once Deployment Error]]></title><description><![CDATA[<p>I'm working on a WPF client app with a SQLite database (with a view to building an iOS app with Xamarin in the future).</p>

<h3 id="theerror">The Error</h3>

<p>The application works fine locally, but when I published and created a Click-Once deployment, the application would not launch and I got this error in the Windows Event Viewer:</p>

<pre><code>Application: My.App.exe Framework Version: v4.0.30319 Description: The process was terminated due to an unhandled exception. Exception Info: System.DllNotFoundException at Microsoft.Data.Sqlite.Interop.NativeMethods.sqlite3_open_v2(IntPtr, Microsoft.Data.Sqlite.Interop.Sqlite3Handle ByRef, Int32, IntPtr) at Microsoft.Data.Sqlite.Interop.NativeMethods.sqlite3_open_v2...
</code></pre>

<p>This means that the interop DLLs can't find the native SQLite DLLs, so can't run.</p>

<p>I've installed the <strong>EntityFramework.Sqlite</strong> NuGet package, which has <strong>Microsoft.Data.Sqlite</strong> as a prerequisite containing the SQLite interop. If we take a look in the packages folder, we can see the native DLLs live under the runtimes folder:</p>

<p><img src='http://webbercross.azurewebsites.net/content/images/2016/Mar/sqlite1.png'  alt="Package folder" /></p>

<p>Now, if we look in our project target directory, we notice there are x86 and x64 sub-directories with these DLLs in:</p>

<p><img src='http://webbercross.azurewebsites.net/content/images/2016/Mar/sqlite2.png'  alt="Target folder" /></p>

<p>Now, when we look in the deployed click-once application which will be in a directory like this:</p>

<pre><code>C:\Users\geoff\AppData\Local\Apps\2.0\WPBJ2B04.BKA\PN5A1WW5.2WE\ques..tion_7b7f6e03bc292b23_0001.0000_24569b1cbc6fe91b
</code></pre>

<p>The x86 and x64 directories are missing; in fact, the DLLs aren't even in the root.</p>
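A quick way to confirm this against any deployment folder is a small Python check; the native DLL name <code>sqlite3.dll</code> is an assumption here, so check your own runtimes folder for the actual file name:

```python
from pathlib import Path

def missing_native_dlls(deploy_dir: str, dll: str = "sqlite3.dll") -> list[str]:
    """Return the architecture folders that lack the native SQLite DLL."""
    return [arch for arch in ("x86", "x64")
            if not (Path(deploy_dir) / arch / dll).is_file()]
```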

<h3 id="solution">Solution</h3>

<p>The solution is simple. We just need to deploy the x86 and x64 directories with the other published binaries. To do this we do the following:</p>

<ul>
<li>Add an x86 and an x64 folder to the Visual Studio project</li>
<li>Link the native binaries from the package folder (you could add them if you wanted):
<img src='http://webbercross.azurewebsites.net/content/images/2016/Mar/sqlite3.png'  alt="DLL folders" /></li>
<li>We can check that these DLLs will be deployed before we publish by opening the project properties, then clicking <strong>Application Files</strong> on the <strong>Publish</strong> tab:
<img src='http://webbercross.azurewebsites.net/content/images/2016/Mar/sqlite4.png'  alt="App Files" />
Now when we publish and run the app, the native DLLs are available and it launches fine.</li>
</ul>]]></description><link>http://webbercross.azurewebsites.net/ef7-sqlite-click-once-deployment-error/</link><guid isPermaLink="false">ed2d58a9-6216-44d8-82b7-4f86acbdc7fd</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Mon, 07 Mar 2016 16:56:46 GMT</pubDate></item><item><title><![CDATA[Restoring Windows 8.1 after Windows 10 Issues]]></title><description><![CDATA[<p>This may be useful if you've installed Win10, but need to quickly get back to a usable machine without spending an entire weekend re-paving it!</p>

<p>I installed (upgrade keeping files/apps rather than clean install) Windows 10 on my work machine on Thursday night last week, then found on Friday my existing VPN connection failed with a message like "Application not found" and the OS went off to the Windows Store to look for my VPN!</p>

<p>I deleted the existing VPN connection and created a new one, only to receive "A certificate chain processed, but terminated in a root certificate which is not trusted by the trust provider" - I've seen this issue before on my machine after a crash and Windows not shutting down properly - I managed to fix it by rolling back to a system restore point, however the machine is on a new operating system!</p>

<p>Luckily, there is an option in Update &amp; Security/Recovery to "Go back to Windows 8.1" (or whatever OS it came from). This option is available for a month, as long as the Disk Cleanup utility hasn't been run to remove previous installation files. I returned the machine to 8.1, which took about half an hour and started fine, although I still got the root certificate error again.</p>

<p>I was preparing myself to spend all weekend re-paving my machine, but remembered the problem I'd had previously and looked in System Restore (type "recovery" on the Win 8.1 start screen, then go to "Open System Restore"). Luckily there were still some restore points and I was able to restore to a point just before installing Windows 10.</p>

<p>After another hour or so, the OS was restored to its pre-Windows-10 state, my VPN connected fine and I was able to work again!</p>]]></description><link>http://webbercross.azurewebsites.net/restoring-windows-8-1-after-windows-10-issues/</link><guid isPermaLink="false">56dbb8f0-8eaa-4553-86b3-b7ed2e467295</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Mon, 10 Aug 2015 08:42:08 GMT</pubDate></item><item><title><![CDATA[EF6 IQueryable Deferred Execution and Select]]></title><description><![CDATA[<p>I've recently been building a new EF data access layer for our system at work and have obviously been asked a lot of questions about how it compares to the flexibility of using the existing native ADO.Net framework. One question was - where there are tables with a large number of fields (100s) and we're only interested in a subset of these, does EF pull back all the fields? I was pretty sure that if you query a DbSet using Select, the deferred execution on the IQueryable interface would return the minimum data required for the Select projection. These tests prove it (I used SQL Server Profiler to capture the SQL):</p>

<p>For a simple select statement to return the IDs from a table like this:</p>

<pre><code>var ctx = new DemoContext();
var item = ctx.Line.Select(l =&gt; l.ID).ToList();
</code></pre>

<p>The following SQL is executed</p>

<pre><code>SELECT 
[Extent1].[ID] AS [ID]
FROM [dbo].[Line] AS [Extent1]    
</code></pre>

<p>We can see that only the ID field is returned.</p>
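The mechanics can be illustrated outside EF too. Here is a toy Python analog (class and method names are invented, not EF's API) in which nothing is "executed" until the terminal call, and the SQL is composed only from the columns the projection asked for:

```python
class Query:
    """Toy stand-in for IQueryable: select() is recorded, not executed."""
    def __init__(self, table: str):
        self.table = table
        self.columns = ["*"]

    def select(self, *columns: str) -> "Query":
        self.columns = list(columns)  # deferred: nothing hits the database yet
        return self

    def to_sql(self) -> str:
        # the "execution" point, analogous to ToList() triggering the query
        return f"SELECT {', '.join(self.columns)} FROM [dbo].[{self.table}]"
```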

<p>Now another example, where we are projecting the results into an anonymous object:</p>

<pre><code>var item = ctx.Line.Select(l =&gt; new { id = l.ID, name = l.Name }).ToList();
</code></pre>

<p>And the resultant SQL:</p>

<pre><code>SELECT 
1 AS [C1], 
[Extent1].[ID] AS [ID], 
[Extent1].[Name] AS [Name]
FROM [dbo].[Line] AS [Extent1]    
</code></pre>

<p>Here we get the fields required for the anonymous object and an extra C1 field which is a dummy PK for the object.</p>]]></description><link>http://webbercross.azurewebsites.net/ef6-iqueryable-deferred-execution-and-select/</link><guid isPermaLink="false">211e8c55-3123-4eab-9fb8-144084bdd01a</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Wed, 15 Jul 2015 08:08:12 GMT</pubDate></item><item><title><![CDATA[Implementing -whatIf in a PowerShell function]]></title><description><![CDATA[<p>I've got a function in a module I use to deploy binaries and files to various websites and wanted to use the -whatIf switch on the function and have all the New-Item and Copy-Item cmdlets pick up the switch.</p>

<p>It took a bit of working out from various sources, but it's actually really simple. Before the param declaration in the function, simply add this cmdletbinding attribute:</p>

<pre><code>[cmdletbinding(SupportsShouldProcess=$True)]
</code></pre>

<p>And the -whatIf (and -confirm) switch is passed through to the cmdlets inside the function.</p>

<p>So the function looks like this:</p>

<pre><code>function copy-Files
{
    [cmdletbinding(SupportsShouldProcess=$True)]
    param(
        $myParam        
    )
    # Function here
}
</code></pre>
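For comparison, the same pass-through dry-run idea sketched in Python (a rough analog only; PowerShell's ShouldProcess plumbing does this propagation for you, and the function/parameter names here are invented):

```python
import shutil

def copy_files(pairs, what_if=False):
    """Copy (src, dest) pairs, or just report what would happen when what_if is set."""
    messages = []
    for src, dest in pairs:
        if what_if:
            # dry run: describe the action instead of performing it
            messages.append(f'What if: Copying "{src}" to "{dest}".')
        else:
            shutil.copy(src, dest)
            messages.append(f'Copied "{src}" to "{dest}".')
    return messages
```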

<p>That's it!</p>]]></description><link>http://webbercross.azurewebsites.net/implementing-whatif-in-a-powershell-function/</link><guid isPermaLink="false">2a32204a-d9b2-46d0-a5c8-5a9b586b19e4</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Thu, 18 Jun 2015 12:27:50 GMT</pubDate></item><item><title><![CDATA[Error 0x800b0109 a certificate chain processed but terminated in a root certificate]]></title><description><![CDATA[<p>My 8.1 machine locked up yesterday and I had a lot of trouble getting it to restart...once it did, it froze for a while, then seemed to return to normal after around 10 minutes. After this my VPN connection stopped working (stuck on Verifying Credentials). I deleted the VPN connection and recreated it, after which I got this error:</p>

<p><strong>error 0x800b0109 a certificate chain processed but terminated in a root certificate</strong></p>

<p>I tried recreating it a number of times, rebooting the machine etc., but still saw the same error whatever I did.</p>

<p>I think the error is indicating that a certificate is not installed on the machine...I suspect it might have been corrupted.</p>

<p>Anyway, my fix was to simply restore my machine to a point 2 days previous before the issue occurred using <em>System Restore</em>: <a href='http://windows.microsoft.com/en-GB/windows-8/restore-refresh-reset-pc' >http://windows.microsoft.com/en-GB/windows-8/restore-refresh-reset-pc</a></p>]]></description><link>http://webbercross.azurewebsites.net/error-0x800b0109-a-certificate-chain-processed-but-terminated-in-a-root-certificate/</link><guid isPermaLink="false">9abd48d8-aa45-45b6-87c5-25d74994d16b</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Fri, 23 Jan 2015 11:50:08 GMT</pubDate></item><item><title><![CDATA[Learning Microsoft Azure]]></title><description><![CDATA[<p>After 6 months of hard work and not much sleep, my new book <strong>Learning Microsoft Azure</strong> has been published by <a href='http://packtpub.com/' >Packt Publishing</a> and is available from <a href='https://www.packtpub.com/virtualization-and-cloud/learning-microsoft-azure' >the packt store</a> and <a href='http://www.amazon.com/Learning-Windows-Azure-Geoff-Webber-Cross/dp/1782173374/' >Amazon</a>.</p>

<p><img src='https://www.packtpub.com/sites/default/files/3373EN_Learning%20Windows%20Azure_cov_0.jpg'  alt="Cover" /></p>

<p>This is the Preface, which gives you a good idea of what's in the book:</p>

<p>Learning Microsoft Azure is a practical, hands-on book for learning how to build systems for Microsoft Azure. The book is themed around an enterprise case study based on a fictional industrial bakery called Azure Bakery which spans 3 business units: Sales, Production and Supply. The entire system is built on Microsoft Azure technology utilizing a broad range of services. </p>

<p>The Sales business unit is responsible for selling products to customers through the MVC5 Customer website where customers can make orders and view their status as the order moves through the system. Products are managed through another administrator website which implements Azure Active Directory authentication. A Windows Phone app with .Net Mobile service and Twitter authentication integrated with the Customer website allows customers to view order status on their phone and receive push notifications via the Notifications Hub when order status changes and new products are created. The Sales system has its own dedicated SQL Azure database and communicates with the other systems via a Service Bus Topic. A worker role is implemented to keep the Sales system updated as orders are processed through the enterprise system.</p>

<p>The Production business unit is responsible for manufacturing the products for the customer orders and has a worker role at the core of it which consumes customer orders from the Service Bus Topic, enters the orders into the Production SQL Azure database, creates batch schedules for baking products and allocates stock in the system. Production staff use an on-premises WPF client application with Azure Active Directory authentication to view batch schedules and manage stock via a Web API 2 service with SignalR hub and Azure Service Bus backplane allowing client applications to update in real-time.</p>

<p>The Supply business unit is responsible for picking and packing orders from the Production business unit and delivering them to the customers. A worker role consumes orders from the Service Bus Topic and stores customer details in Table Storage, automatically creating barcode labels stored in BLOB storage. Supply staff interact with the system via an Enterprise Windows Store app which is authenticated with Azure Active Directory and has a .Net Mobile Service backend. <br />
As we’re building the system we learn about the topic we’re exploring and apply it to our system with detailed walk-throughs and relevant code samples. There are full working code samples for the entire system, broken down chapter by chapter.</p>

<p><strong>What this book covers</strong></p>

<h3 id="chapter1gettingstartedwithmicrosoftazure">Chapter 1 - Getting Started with Microsoft Azure</h3>

<p>An introduction to cloud computing and Microsoft Azure, followed by how to choose and sign up for a subscription. We finish the chapter with a look around the portal and start looking at the different services Microsoft Azure has to offer.</p>

<h3 id="chapter2designingasystemformicrosoftazure">Chapter 2 - Designing a System for Microsoft Azure</h3>

<p>Designing scalable, resilient systems for Microsoft Azure, looking at methodologies for breaking systems into sub-systems and selecting appropriate Azure services to build them. The process will be applied to designing a small system for an independent stationer requiring a website and basic administration system, then extended to a full enterprise system where we will introduce the Azure Bakery case study.</p>

<h3 id="chapter3startingdevelopingwithmicrosoftazure">Chapter 3 - Starting Developing with Microsoft Azure</h3>

<p>This is the first taste of developing for Microsoft Azure, where the reader will prepare their development environment with the required tools and sign up for a Visual Studio Online account. We’ll create the foundations of the Sales Customer website and publish it to the cloud, then set up continuous deployment using Visual Studio Online Team Foundation build.</p>

<h3 id="chapter4creatingandmanaginganazuresqlserverdatabase">Chapter 4 - Creating and Managing an Azure SQL Server Database</h3>

<p>We’ll create a database for the Sales business unit and build it using Entity Framework Code-First migrations. The chapter will examine different tools for working with the database from a developer and administrator point of view and look at options for database backup.  </p>

<h3 id="chapter5buildingazuremvcwebsites">Chapter 5 - Building Azure MVC Websites</h3>

<p>In this chapter the Sales Customer website and Administrator website are built with Twitter authentication for the Customer site and Azure Active Directory authentication for the Administrator site. We learn how to apply custom domain names and SSL certificates to Azure websites and learn how to do Azure AD group authorization in an MVC website.  </p>

<h3 id="chapter6azurewebsitediagnosticsanddebugging">Chapter 6 - Azure Website Diagnostics and Debugging</h3>

<p>This chapter follows on from the previous chapter, exploring techniques and tools to help diagnose problems and debug Azure websites. We’ll look at enabling diagnostics in websites, working with log files and examining application logging and site diagnostics. Finally we’ll look at the Kudu service and remote debugging Azure websites.  </p>

<h3 id="chapter7azureservicebustopicintegration">Chapter 7 - Azure Service Bus Topic Integration</h3>

<p>Starting with an overview of Service Bus Topics and creating a Topic for handling order messaging between the 3 business tiers, we’ll integrate the Sales Customer website into the Topic with a Subscription, allowing newly created orders to be sent across the system where they will be collected by the Production system for manufacturing and the Supply system for producing address labels and planning deliveries. We’ll also create a messaging simulator to allow the Topic to be loaded up with high volumes of orders to help test the scalability and capacity of the system. Finally we’ll look at the features in the portal to help us monitor and manage our Service Bus Topic.</p>

<h3 id="chapter8buildingworkerroles">Chapter 8 - Building Worker Roles</h3>

<p>After an introduction to cloud services and creating a worker role, we’ll create and run a basic cloud service locally on the compute emulator, then publish and run it in the cloud. The Production Order processor is created next, which is responsible for receiving orders from the Service Bus Topic, saving them to the Production database, creating product batch schedules and allocating stock. Finally we’ll test the cloud service in a scaled deployment using the simulator created in Chapter 7.</p>

<h3 id="chapter9cloudservicediagnosticsdebuggingandconfiguration">Chapter 9 - Cloud Service Diagnostics, Debugging and Configuration</h3>

<p>This chapter follows on from the previous chapter, covering diagnostics, remote debugging and Intellitrace. We’ll learn how to deal with configuration changes made in the portal at runtime and implement startup tasks for performing customisations for preparing the server environment for the service.  </p>

<h3 id="chapter10webapiandclientintegration">Chapter 10 - Web API and Client Integration</h3>

<p>An introduction to Web API and SignalR with an Azure Service Bus backplane, followed by building a Web API service and SignalR hub to allow the Production Management application to interact with the Production database and Service Bus Topic. The system will be authenticated with Azure AD, allowing production staff to log in to the WPF client application using their Azure AD credentials.</p>

<h3 id="chapter11integratingamobileapplicationusingmobileservices">Chapter 11 - Integrating a Mobile Application using Mobile Services</h3>

<p>This chapter brings the whole system together with the addition of a Mobile Service and Windows Phone 8 application for the Sales system which allows users to login with the same credentials as the Customer website, view orders and receive order updates and product news via the Notifications Hub. The sales Mobile Service provides APIs for the Admin website and Order processor to interact with the Notifications Hub. Finally the chapter looks at building an Azure AD authenticated Mobile Service for the Supply Windows Store application for viewing orders and retrieving address labels from BLOB storage created by the Supply Order Processor.  </p>

<h3 id="chapter12preparingandazuresystemforproduction">Chapter 12 - Preparing an Azure System for Production</h3>

<p>This final chapter looks at configuring systems for various environments including production and creating publishing packages using Visual Studio Online Team Foundation build server and producing database scripts in order for the system deployments to be managed in a controlled way by systems administrators or developers. We’ll learn about monitoring the different services implemented throughout the book once they are live and also cover guidelines for publishing web-connected mobile applications.</p>]]></description><link>http://webbercross.azurewebsites.net/learning-microsoft-azure/</link><guid isPermaLink="false">93eab6e1-48f9-4152-8658-7b35b72cf44a</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Mon, 20 Oct 2014 09:52:48 GMT</pubDate></item><item><title><![CDATA[Local Debugging an Azure Mobile Service with AD Auth]]></title><description><![CDATA[<p>I'm finishing off one of the last chapters which is on Mobile Services for my new book Learning Microsoft Azure for <a href='http://packtpub.com/' >Packt Publishing</a>. The book is based on an enterprise case study spanning multiple business domains and uses Azure Active Directory for authenticating the internal (used by company staff rather than the public) systems. <br />
Using AD authentication is slightly different to the other OAuth providers such as Twitter and Facebook as the MobileServiceClient.LoginAsync method requires an access token obtained from the AuthenticationContext.AcquireTokenAsync method which calls the mobile service /login/aad endpoint. There are some good <a href='http://azure.microsoft.com/en-us/documentation/articles/mobile-services-windows-store-dotnet-adal-sso-authentication/' >examples</a> of doing this, however there is no detail on debugging the service locally.</p>

<h2 id="theproblem">The Problem</h2>

<p>In order to call the /login/aad endpoint locally, we need to enable HTTPS on the web server (IIS Express) and use a login URL like <a href='https://localhost:12345/login/aad' >https://localhost:12345/login/aad</a> with the AuthenticationContext. The Windows Store application doesn't like this endpoint as it's got a temporary SSL certificate without a trusted root, and we don't have the option to accept untrusted certificates like we can in .Net clients using ServicePointManager.ServerCertificateValidationCallback like this: <br />
<code>ServicePointManager.ServerCertificateValidationCallback += (se, cert, chain, sslerror) =&gt; true;</code></p>
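For comparison, a Python sketch of the same trust-everything idea (local debugging only; never ship anything like this):

```python
import ssl

def insecure_dev_context() -> ssl.SSLContext:
    """A trust-anything TLS context, morally equivalent to returning true from
    ServerCertificateValidationCallback. For local debugging only."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # must be disabled before relaxing verify_mode
    ctx.verify_mode = ssl.CERT_NONE  # skip certificate chain validation
    return ctx
```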

<p>So we get an exception with the message <em>The underlying connection was closed. Could not establish trust relationship for the SSL/TLS secure channel</em> when we call AuthenticationContext.AcquireTokenAsync. <br />
I had a play about with the capabilities in the Package.appxmanifest to see if anything could be done to allow the certificate. I enabled "Shared User Certificate", which got rid of the exception but then came back with a 401 Unauthorized exception on LoginAsync, which I think is because this setting is for using an SSL cert to secure comms between the client and <a href='http://msdn.microsoft.com/en-us/library/windows/apps/dn448938.aspx' >custom web services</a>.</p>

<h2 id="possibleoptions">Possible Options</h2>

<p>I don't think there is an easy way of getting this to work locally. We could apply a trusted SSL certificate to our local service; however, this requires purchasing one in the first place and then doing additional configuration. The other option would be to create a temporary certificate, configure that on the local web server, and configure certificates in the Package.appxmanifest, which would need removing on publish.</p>

<h2 id="workaround">Workaround</h2>

<p>This was bugging me for a while and I was close to giving up and just debugging against Azure, but I suddenly thought we might be able to authenticate against a published service on Azure, then use the credentials with a client pointing at our local service...well, it works! <br />
First thing to do is publish the Mobile Service, then modify the web.config app settings. Use the same keys as the published service to save switching them in code:</p>

<p><code>&lt;add key="MS_MasterKey" value="YYYYYYYYYYYYYXXXXXXXXXXXXXXXXXXXXxxxxxxxxxxxxxx" /&gt; &lt;add key="MS_ApplicationKey" value="YYYYYYYYYYYYYXXXXXXXXXXXXXXXXXXXXxxxxxxxxxxxxxx" /&gt;</code></p>

<p>These settings aren't well documented, but add these, with <code>MS_AadClientID</code> being the client ID of the AD application for the Azure service (<a href='http://azure.microsoft.com/en-us/documentation/articles/mobile-services-windows-store-dotnet-adal-sso-authentication/' >see this example</a>) and <code>MS_AadTenants</code> being the AD tenant domain:</p>

<p><code>&lt;!-- Manually add these --&gt;<br />
&lt;add key="MS_AadClientID" value="XXXXXXXXXXXyyyyyyyyyyy" /&gt;<br />
&lt;add key="MS_AadTenants" value="myservice.onmicrosoft.com" /&gt;</code></p>

<p>In my auth base class I have something like this at the top:</p>

<pre><code>private const string authority = "https://login.windows.net/myapplication.onmicrosoft.com";
private const string clientID = "xxxxxxxxxxxxxYYYYYYYYYYYYY";

private const string azureUri = "https://myservice.azure-mobile.net";

private const string resourceURI = "https://myservice.azure-mobile.net/login/aad";

#if DEBUG
protected readonly static MobileServiceClient _mobileService = new MobileServiceClient(
    "http://localhost:58932/",
    "XXXXXXXXXXXXXXXyyyyyyyyyyyyyyyyy"
    );
#else
protected readonly static MobileServiceClient _mobileService = new MobileServiceClient(
    "https://myservice.azure-mobile.net",
    "XXXXXXXXXXXXXXXyyyyyyyyyyyyyyyyy"
    );
#endif
</code></pre>

<p>When we're in DEBUG mode, we're using the local URL (HTTP not HTTPS), which will allow us to debug the service once we've logged into our AD tenant. <br />
Now at the login step, if we're in DEBUG mode, we quickly new-up a MobileServiceClient pointing to the published Mobile Service, login, then copy the user object into our static MobileServiceClient to be used by all classes implementing the base class:</p>

<pre><code>var ac = new AuthenticationContext(authority);
var ar = await ac.AcquireTokenAsync(resourceURI, clientID);
var payload = new JObject();
payload["access_token"] = ar.AccessToken;

MobileServiceUser user = null;
#if DEBUG
// Create temp client to log in to the Azure service, then pass the user to the local client
var client = new MobileServiceClient(azureUri, _mobileService.ApplicationKey);
user = await client.LoginAsync(_provider, payload);

// Swap user
_mobileService.CurrentUser = user;
#else
// Login
user = await _mobileService.LoginAsync(_provider, payload);
#endif
</code></pre>

<h2 id="conclusion">Conclusion</h2>

<p>This workaround is really simple to implement and works really well. It's not really exploiting a security loophole either, because even though we're getting an authentication token from a different service, the local service is still validating the token against the AD application with the same identifiers as the published version.</p>]]></description><link>http://webbercross.azurewebsites.net/local-debugging-an-azure-mobile-service-with-ad-auth/</link><guid isPermaLink="false">5bcbaf14-845e-409b-8b4b-e23f6c62e763</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Mon, 28 Jul 2014 10:07:00 GMT</pubDate></item><item><title><![CDATA[Manually Installing a Neo4j Database on an Azure VM]]></title><description><![CDATA[<p><a href='http://neo4j.com/' >Neo4j</a> is a graph database used for storing connected data in a graph rather than in a traditional relational database such as SQL Server or Oracle. This article talks about the manual process of hosting a Neo4j database on an Azure virtual machine and configuring ACL-restricted remote connections. This is an IaaS (Infrastructure as a Service) deployment as we're manually creating a VM and installing and configuring the software on there. 
There are some other good articles on deploying Neo4j to Azure, but here we'll go through how it all fits together and cover ACL configuration in more detail. <br />
It's possible to deploy Neo4j as a cloud service (PaaS) with startup tasks to do the installation; however, there are still a number of manual modifications to make, so unless you need a number of deployments, it may not be worth the extra overhead.</p>

<h2 id="creatingavm">Creating a VM</h2>

<p>First of all we need to create a VM on Azure, so log in to the portal (I'm using the old portal as the new one isn't feature complete yet), go to the Virtual Machines workspace, then click the +NEW button in the toolbar and fill in the new VM details: <br />
<img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/1.png'  alt="portal" /></p>

<p>The size of the VM is quite important for a Neo4j server, as we can scale up (more CPU, memory etc.) but not scale out (multiple instances), because the database engine can't currently be partitioned.</p>

<h2 id="downloadinginstallers">Downloading Installers</h2>

<p>Once the VM is created, click the CONNECT button on the VM workspace toolbar to download the RDP (Remote Desktop Protocol) shortcut which you can use to remote desktop to the server. Accept all the warnings and you should see the server desktop. <br />
Go to the <a href='http://neo4j.com/download/' >Neo4j download site</a> and locate the zip version so we can install it as a service. It's a pain to download things on a server as IE is quite tightly locked down, so the easiest way is to use the PowerShell BitsTransfer module. To do this, open PowerShell from the server desktop or start menu and enter the following command to load the module: <br />
<code>import-module bitstransfer</code></p>

<p>Then enter a command like this to download from a URL to a location on the server: <br />
<code>start-bitstransfer -source http://dist.neo4j.org/neo4j-community-2.1.2-windows.zip -destination c:\users\geoff\downloads\neo4j-community-2.1.2-windows.zip</code></p>

<p>A nice progress indicator appears as the file downloads. My console looked like this: <br />
<img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/2.png'  alt="bitsdownloader" /></p>

<p>Neo4j, as the j suggests, runs on the JRE, so we need to download that too. I got an installer from a mirror site as the Oracle link doesn't work with bitstransfer due to the usage policy: <br />
<code>start-bitstransfer -source http://java64.net/download/jre-8u11-windows-x64.exe -destination c:\users\geoff\downloads\jre-8u11-windows-x64.exe</code></p>

<h2 id="installingtheservice">Installing the Service</h2>

<p>Follow this procedure to install the service:</p>

<ol>
<li>Extract the contents of the zip file and copy to where you want the service to run from  </li>
<li>Open a command prompt (just type cmd on start screen)  </li>
<li>Change directory to the neo4j bin folder and run <code>Neo4jInstaller.bat install</code>.  </li>
<li>The service can be started with the following command: <br />
<code>sc start neo4j-server</code></li>
<li>The service can be stopped with the following command: <br />
<code>sc stop neo4j-server</code></li>
<li>By default it's set to start automatically when the server starts.  </li>
<li>If we want to allow remote connections (which is highly likely) we need to allow the server to accept connections from remote machines, so stop the service, then open the following config in notepad: <br />
<code>conf\neo4j-server.properties</code></li>
<li>Find the following section: <br />
<code># Let the webserver only listen on the specified IP. Default is localhost (only<br />
# accept local connections). Uncomment to allow any connection. Please see the<br />
# security section in the neo4j manual before modifying this.<br />
#org.neo4j.server.webserver.address=0.0.0.0</code></li>
<li>Uncomment the last line and save the file: <br />
<code>org.neo4j.server.webserver.address=0.0.0.0</code>  </li>
<li>If you want to change the port, you can do this here too  </li>
<li>Start the service again  </li>
<li>Test that the page opens locally in the browser: <br />
<code>http://localhost:7474/browser</code></li>
</ol>

<h2 id="configuringtheserverfirewall">Configuring the Server Firewall</h2>

<p>We need to add a firewall rule to allow remote machines to connect to port 7474. Follow this procedure to do this: <br />
1. Open Windows Firewall with Advanced Security from the Server Manager menu: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/3.png'  alt="server manager" title="" /> <br />
2. Click New Rule in the actions menu and select Port in the New Inbound Rule Wizard: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/4.png'  alt="firewall1 " title="" /> <br />
3. Click Next then enter the port number in the Specific local ports box: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/5-1.png'  alt="firewall2" title="" /> <br />
4. Click Next 3 times, then enter a rule Name and click Finish: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/6.png'  alt="firewall3" title="" /></p>

<h2 id="configuringanendpoint">Configuring an Endpoint</h2>

<p>Now we've got the internal firewall set up, systems on the internal Azure network can connect, but if we want to connect from an external network we need to configure a public endpoint so the Azure load balancer will allow connections through. To configure an endpoint, follow this procedure: <br />
1. In the Azure portal navigate to our VM in the Virtual Machine workspace and click on the ENDPOINTS tab; notice we have endpoints already configured for PowerShell and Remote Desktop (which is how we got in there in the first place!). Click add on the toolbar to add a new endpoint: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/7.png'  alt="endpoint1" title="" />
2. Leave the default ADD A STAND-ALONE endpoint as we can't load-balance the database: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/8.png'  alt="endpoint2" title="" />
3. Click the next arrow and fill in the name and ports: <br />
<img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/9.png'  alt="endpoint3" title="" /> <br />
4. Click the tick to finish <br />
5. You should now be able to connect remotely in a browser with a url like this: <br />
<code>http://myvmname.cloudapp.net:7474/browser/</code></p>
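<p>Before locking anything down, it's handy to verify the endpoint is actually reachable from your dev machine. A minimal reachability check, sketched here in Python (the hostname and port below are the placeholders from the example above, not real values):</p>

```python
import socket

def port_open(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("myvmname.cloudapp.net", 7474)
```

<p>This only proves the load balancer forwards the port; the Neo4j browser page itself is still worth checking in a browser.</p>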

<h2 id="configuringacl">Configuring ACL</h2>

<p>Now that we have an endpoint to connect to, anybody on the internet can connect, so it's a good idea to lock it down to a selection of individual IP addresses or ranges of IP addresses using an Access Control List (ACL). <br />
We'll add ACL rules for individual IP addresses and IP ranges in the following procedure: <br />
1. Select the endpoint we just created and click MANAGE ACL on the toolbar: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/10.png'  alt="acl1" title="" /> <br />
2. Add a description and enter the single IP address with /32 to lock the CIDR address down to a single IP address: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/11.png'  alt="acl2" title="" /> <br />
3. Next we'll add a range of addresses. I used an online tool to calculate the CIDR address from my IP range, then added that. <br />
4. Finally add a Deny rule for all other addresses: <br />
    <img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/12.png'  alt="acl3" title="" /> <br />
5. Test that the browser site still loads (it may take a while for the ACL permissions to take effect) <br />
6. Test that a remote machine outside the permitted list cannot see the site. It's important to put the Permit rules before the Deny rules, otherwise they will not work.</p>

<h2 id="conclusion">Conclusion</h2>

<p>We've had a good look at the different aspects of what is required to host a Neo4j database on an Azure VM, and it's been a worthwhile exercise in understanding how Neo4j is installed as a service, how to set up the server and how to configure the VM's Endpoints and ACL.</p>]]></description><link>http://webbercross.azurewebsites.net/manually-installing-a-neo4j-database-on-an-azure-vm/</link><guid isPermaLink="false">5564635f-8a25-4535-8c2c-d17ff201b6a3</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Mon, 21 Jul 2014 11:00:00 GMT</pubDate></item><item><title><![CDATA[Entlib SLAB with MVC5 Website and Azure Trace Listener]]></title><description><![CDATA[<p>If you’ve worked in .Net for a while you will no doubt have come across the Microsoft Patterns and Practices Enterprise Libraries (Entlib) for Logging, Data Access, Exception Handling etc., which are a set of application blocks to help us build applications with common components. Entlib SLAB (Semantic Logging Application Block) is a new logging block which helps us achieve logging in a more structured way than traditional logging libraries such as Entlib Logging and log4net. It allows us to use the .Net 4.5 EventSource class and log events to file, SQL database and Azure Storage tables.</p>

<p>I did research into SLAB in preparation for the book I'm writing for <a href='http://packtpub.com/' >Packt Publishing</a> called <strong>"Learning Microsoft Azure"</strong> but didn't have space in the diagnostics chapter, so I'm writing about it here.</p>

<p>It’s a good idea to do an extra bit of reading on SLAB as it’s quite complicated and we’re going to go through the implementation quite quickly here. There is some good information and examples on the CodePlex site: <a href='http://entlib.codeplex.com/wikipage?title=Entlib6CTPReleaseNotes' >http://entlib.codeplex.com/wikipage?title=Entlib6CTPReleaseNotes</a>. <br />
We’ll implement Entlib SLAB in an MVC5 website, deployed to Azure with Azure table storage for our trace sink in the following procedure:</p>

<p>Install the EnterpriseLibrary.SemanticLogging.WindowsAzure NuGet package with the following command in the Package Manager Console:</p>

<p><code>Install-Package EnterpriseLibrary.SemanticLogging.WindowsAzure</code></p>

<p>Create an EventSource class for the website (I called mine CustomerWebsiteEvents and put it in a folder called Instrumentation). The EventSource class contains EventKeywords, EventTasks and EventOpcodes which are used to decorate explicit event methods for every operation you wish to log (sorry, this is a bit long-winded, but everything is in there):</p>

<pre><code>using System;
using System.Diagnostics.Tracing;

namespace AzureBakery.Sales.CustomerWebsite.Instrumentation
{
    [EventSource(Name = "CustomerWebsiteEvents")]
    public class CustomerWebsiteEvents : EventSource
    {
        public static class Keywords
        {
            public const EventKeywords Application = (EventKeywords)1L;
            public const EventKeywords DataAccess = (EventKeywords)2L;
            public const EventKeywords Controller = (EventKeywords)4L;
        }

        public static class Tasks
        {
            // For LogActionFilter
            public const EventTask Action = (EventTask)1;
            public const EventTask Result = (EventTask)2;

            // General
            public const EventTask Initialize = (EventTask)6;
            public const EventTask GetProducts = (EventTask)7;
            public const EventTask AddProductToOrder = (EventTask)8;
        }

        public static class Opcodes
        {
            public const EventOpcode Starting = (EventOpcode)20;
            public const EventOpcode Started = (EventOpcode)21;
            public const EventOpcode Error = (EventOpcode)22;

            public const EventOpcode Ending = (EventOpcode)23;
            public const EventOpcode Ended = (EventOpcode)24;

            public const EventOpcode Executed = (EventOpcode)30;
            public const EventOpcode Executing = (EventOpcode)31;
        }

        public static readonly CustomerWebsiteEvents Log = new CustomerWebsiteEvents();

        [Event(100, Level = EventLevel.Verbose, Keywords = Keywords.Application, Task = Tasks.Initialize, Opcode = Opcodes.Starting, Version = 1)]
        public void ApplicationStarting()
        {
            if (this.IsEnabled(EventLevel.Verbose, Keywords.Application))
            {
                this.WriteEvent(100);
            }
        }

        [Event(101, Level = EventLevel.Informational, Keywords = Keywords.Application, Task = Tasks.Initialize, Opcode = Opcodes.Started, Version = 1)]
        public void ApplicationStarted()
        {
            if (this.IsEnabled(EventLevel.Informational, Keywords.Application))
            {
                this.WriteEvent(101);
            }
        }

        [Event(102, Level = EventLevel.Error, Keywords = Keywords.Application, Opcode = Opcodes.Error, Version = 1)]
        public void ApplicationError(string exceptionMessage, string exceptionType)
        {
            if (this.IsEnabled(EventLevel.Error, Keywords.Application))
            {
                this.WriteEvent(102, exceptionMessage, exceptionType);
            }
        }

        [Event(200, Level = EventLevel.Informational, Keywords = Keywords.Controller, Task = Tasks.Action, Opcode = Opcodes.Executing, Version = 1)]
        public void ActionExecuting(string controller, string action)
        {
            if (this.IsEnabled(EventLevel.Informational, Keywords.Controller))
            {
                this.WriteEvent(200, controller, action);
            }
        }

        [Event(201, Level = EventLevel.Verbose, Keywords = Keywords.Controller, Task = Tasks.Action, Opcode = Opcodes.Executed, Version = 1)]
        public void ActionExecuted(string controller, string action)
        {
            if (this.IsEnabled(EventLevel.Verbose, Keywords.Controller))
            {
                this.WriteEvent(201, controller, action);
            }
        }

        [Event(202, Level = EventLevel.Informational, Keywords = Keywords.Controller, Task = Tasks.Result, Opcode = Opcodes.Executing, Version = 1)]
        public void ResultExecuting(string controller, string action)
        {
            if (this.IsEnabled(EventLevel.Informational, Keywords.Controller))
            {
                this.WriteEvent(202, controller, action);
            }
        }

        [Event(203, Level = EventLevel.Verbose, Keywords = Keywords.Controller, Task = Tasks.Result, Opcode = Opcodes.Executed, Version = 1)]
        public void ResultExecuted(string controller, string action)
        {
            if (this.IsEnabled(EventLevel.Verbose, Keywords.Controller))
            {
                this.WriteEvent(203, controller, action);
            }
        }

        // Note: every event needs a unique event ID, so the task-specific
        // events below use IDs 204-209 rather than reusing 200
        [Event(204, Level = EventLevel.Informational, Keywords = Keywords.Controller, Task = Tasks.GetProducts, Opcode = Opcodes.Started, Version = 1)]
        public void ProductControllerGetProductsStarted()
        {
            if (this.IsEnabled(EventLevel.Informational, Keywords.Controller))
            {
                this.WriteEvent(204);
            }
        }

        [Event(205, Level = EventLevel.Informational, Keywords = Keywords.Controller, Task = Tasks.GetProducts, Opcode = Opcodes.Ended, Version = 1)]
        public void ProductControllerGetProductsEnded()
        {
            if (this.IsEnabled(EventLevel.Informational, Keywords.Controller))
            {
                this.WriteEvent(205);
            }
        }

        // Event payloads must be primitive types or strings, so exceptions
        // are flattened to a string via a [NonEvent] helper
        [NonEvent]
        public void ProductControllerGetProductsError(Exception ex)
        {
            this.ProductControllerGetProductsErrorDetail(ex.ToString());
        }

        [Event(206, Level = EventLevel.Error, Keywords = Keywords.Controller, Task = Tasks.GetProducts, Opcode = Opcodes.Error, Version = 1)]
        public void ProductControllerGetProductsErrorDetail(string exceptionDetails)
        {
            if (this.IsEnabled(EventLevel.Error, Keywords.Controller))
            {
                this.WriteEvent(206, exceptionDetails);
            }
        }

        [Event(207, Level = EventLevel.Informational, Keywords = Keywords.Controller, Task = Tasks.AddProductToOrder, Opcode = Opcodes.Started, Version = 1)]
        public void ProductControllerAddProductToOrderStarted()
        {
            if (this.IsEnabled(EventLevel.Informational, Keywords.Controller))
            {
                this.WriteEvent(207);
            }
        }

        [Event(208, Level = EventLevel.Informational, Keywords = Keywords.Controller, Task = Tasks.AddProductToOrder, Opcode = Opcodes.Ended, Version = 1)]
        public void ProductControllerAddProductToOrderEnded()
        {
            if (this.IsEnabled(EventLevel.Informational, Keywords.Controller))
            {
                this.WriteEvent(208);
            }
        }

        [NonEvent]
        public void ProductControllerAddProductToOrderError(Exception ex)
        {
            this.ProductControllerAddProductToOrderErrorDetail(ex.ToString());
        }

        [Event(209, Level = EventLevel.Error, Keywords = Keywords.Controller, Task = Tasks.AddProductToOrder, Opcode = Opcodes.Error, Version = 1)]
        public void ProductControllerAddProductToOrderErrorDetail(string exceptionDetails)
        {
            if (this.IsEnabled(EventLevel.Error, Keywords.Controller))
            {
                this.WriteEvent(209, exceptionDetails);
            }
        }
    }
}
</code></pre>
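<p>The EventKeywords values in the Keywords class are deliberate powers of two, so a listener can combine them and filter events with bitwise arithmetic. Here's the matching idea sketched in Python (this illustrates the concept, it's not the actual EventSource implementation):</p>

```python
# Keyword bit flags, mirroring the Keywords class above
APPLICATION = 0x1
DATA_ACCESS = 0x2
CONTROLLER = 0x4
ALL_KEYWORDS = 0xFFFFFFFFFFFFFFFF  # stand-in for Keywords.All

def keyword_match(event_keywords, listener_keywords):
    """An event passes if it shares at least one keyword bit with the listener."""
    return (event_keywords & listener_keywords) != 0

# A listener enabled only for Controller events
assert keyword_match(CONTROLLER, CONTROLLER)
assert not keyword_match(APPLICATION, CONTROLLER)
# Keywords combine with bitwise OR
assert keyword_match(DATA_ACCESS, APPLICATION | DATA_ACCESS)
assert keyword_match(CONTROLLER, ALL_KEYWORDS)
```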

<p>Next we need to modify the Global.asax to set up the listeners (the settings come from appSettings in web.config, which we'll cover in a second); I've also added some tracing for the application start and global errors:</p>

<pre><code>public class MvcApplication : System.Web.HttpApplication  
{
    void Application_Start()
    {
        this.SetupListeners();

        CustomerWebsiteEvents.Log.ApplicationStarting();

        base.Error += MvcApplication_Error;

        AreaRegistration.RegisterAllAreas();
        FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
        RouteConfig.RegisterRoutes(RouteTable.Routes);
        BundleConfig.RegisterBundles(BundleTable.Bundles);

        CustomerWebsiteEvents.Log.ApplicationStarted();
    }

    private void SetupListeners()
    {
        // Get log details from web.config (these can be configured in the website CONFIGURE tab)
        var logLevel = (EventLevel)Enum.Parse(typeof(EventLevel), ConfigurationManager.AppSettings["logLevel"]);
        var logAccountName = ConfigurationManager.AppSettings["logAccountName"];
        var logAccountKey = ConfigurationManager.AppSettings["logAccountKey"];
        var logTableName = ConfigurationManager.AppSettings["logTableName"];

        // Create listener and enable events
        var listener = new ObservableEventListener();
        listener.EnableEvents(CustomerWebsiteEvents.Log, logLevel, Keywords.All);

        // Build storage connection string
        var cnString = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}", logAccountName, logAccountKey);

        // Log listener to Azure table storage
        listener.LogToWindowsAzureTable("WindowsAzureListener", cnString, logTableName);
    }

    void MvcApplication_Error(object sender, EventArgs e)
    {
        var ex = Server.GetLastError();
        CustomerWebsiteEvents.Log.ApplicationError(ex.Message, ex.GetType().FullName);
    }
}
</code></pre>
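<p>For reference, the connection string assembled in SetupListeners follows the standard Azure storage connection string format. The same string-building, sketched in Python with placeholder credentials:</p>

```python
def storage_connection_string(account_name, account_key):
    # Mirrors the string.Format call in SetupListeners
    return ("DefaultEndpointsProtocol=https;"
            "AccountName={0};AccountKey={1}").format(account_name, account_key)

print(storage_connection_string("myAccountName", "myKey=="))
# DefaultEndpointsProtocol=https;AccountName=myAccountName;AccountKey=myKey==
```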

<p>As I mentioned, I've put all the table logging settings in the web.config, which can be modified in the portal if needed. Once we have a storage account, we can fill in the details:</p>

<pre><code>&lt;appSettings&gt;  
    &lt;add key="webpages:Version" value="3.0.0.0" /&gt;
    &lt;add key="webpages:Enabled" value="false" /&gt;
    &lt;add key="ClientValidationEnabled" value="true" /&gt;
    &lt;add key="UnobtrusiveJavaScriptEnabled" value="true" /&gt;
    &lt;!-- SLAB --&gt;
    &lt;add key="logAccountName" value="myAccountName" /&gt;
    &lt;add key="logAccountKey" value="xxxxxxxxxxxxxxXXXXXXXXXXMyKey==" /&gt;
    &lt;add key="logTableName" value="SalesCustomerLog" /&gt;
    &lt;add key="logLevel" value="LogAlways" /&gt;
&lt;/appSettings&gt;
</code></pre>
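<p>The logLevel setting is parsed with Enum.Parse and handed to EnableEvents as a severity threshold, where LogAlways (0) enables everything and lower numbers are more severe. Here's the filtering idea sketched in Python, using the EventLevel values from System.Diagnostics.Tracing (a conceptual sketch, not the listener's actual code):</p>

```python
# EventLevel values, as defined in System.Diagnostics.Tracing
LEVELS = {"LogAlways": 0, "Critical": 1, "Error": 2,
          "Warning": 3, "Informational": 4, "Verbose": 5}

def level_enabled(event_level, listener_level):
    """LogAlways on the listener enables all events; otherwise an event
    passes if its severity is at or above the listener's threshold."""
    if LEVELS[listener_level] == 0:
        return True
    return LEVELS[event_level] <= LEVELS[listener_level]

assert level_enabled("Error", "Informational")
assert not level_enabled("Verbose", "Warning")
assert level_enabled("Verbose", "LogAlways")
```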

<p>Now we can instrument our controllers and data access layer. Here's an action from a controller:</p>

<pre><code>[Authorize]
public ActionResult AddToOrder(int id)
{
    CustomerWebsiteEvents.Log.ProductControllerAddProductToOrderStarted();

    OrderItem item = null;

    try
    {
        // Get customer details
        var uid = User.Identity.GetUserId();

        // Add to order
        item = this._uow.AddToOrder(id, uid);

        // Save
        this._uow.SaveChanges();
    }
    catch (Exception ex)
    {
        CustomerWebsiteEvents.Log.ProductControllerAddProductToOrderError(ex);

        throw;
    }

    CustomerWebsiteEvents.Log.ProductControllerAddProductToOrderEnded();

    return RedirectToAction("Index", new { productType = item.Product.ProductType });
}
</code></pre>

<p>Now if we publish the website, then exercise it in the browser and examine the table storage, we should see our application tracing being logged:</p>

<p><img src='http://webbercross.azurewebsites.net/content/images/2014/Sep/slab-1.png'  alt="slab" /></p>

<p>Manually creating methods for numerous controller actions can be quite time-consuming, so I put together an ActionFilterAttribute to log the controller action life-cycle events:</p>

<pre><code>public class LogActionFilter : ActionFilterAttribute
{
    public override void OnActionExecuted(ActionExecutedContext filterContext)
    {
        var controllerName = filterContext.RouteData.Values["controller"].ToString();
        var actionName = filterContext.RouteData.Values["action"].ToString();
        CustomerWebsiteEvents.Log.ActionExecuted(controllerName, actionName);

        base.OnActionExecuted(filterContext);
    }

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var controllerName = filterContext.RouteData.Values["controller"].ToString();
        var actionName = filterContext.RouteData.Values["action"].ToString();
        CustomerWebsiteEvents.Log.ActionExecuting(controllerName, actionName);

        base.OnActionExecuting(filterContext);
    }

    public override void OnResultExecuted(ResultExecutedContext filterContext)
    {
        var controllerName = filterContext.RouteData.Values["controller"].ToString();
        var actionName = filterContext.RouteData.Values["action"].ToString();
        CustomerWebsiteEvents.Log.ResultExecuted(controllerName, actionName);

        base.OnResultExecuted(filterContext);
    }

    public override void OnResultExecuting(ResultExecutingContext filterContext)
    {
        var controllerName = filterContext.RouteData.Values["controller"].ToString();
        var actionName = filterContext.RouteData.Values["action"].ToString();
        CustomerWebsiteEvents.Log.ResultExecuting(controllerName, actionName);

        base.OnResultExecuting(filterContext);
    }
}
</code></pre>

<p>This can easily be added at controller level to achieve logging for all actions like this:</p>

<pre><code>namespace AzureBakery.Sales.CustomerWebsite.Controllers
{
    [LogActionFilter]
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            return View();
        }
    }
}
</code></pre>

<p>Overall using EventSource and SLAB is extremely powerful for creating tailored tracing for websites compared with the standard System.Diagnostics.Trace object. We can create table logs which make it easy to pinpoint exactly the diagnostic information we're looking for by querying the EventId, Keyword, Type, OpCode and Level.</p>]]></description><link>http://webbercross.azurewebsites.net/entlib-slab-with-mvc5-website-and-azure-trace-listener/</link><guid isPermaLink="false">6f22fa7c-c8e0-4800-877d-5f4b54ba68c5</guid><dc:creator><![CDATA[Geoff]]></dc:creator><pubDate>Wed, 21 May 2014 10:17:00 GMT</pubDate></item></channel></rss>