Observations from a Full-Scale Migration to Windows Azure, Part 1 (Highlights)


Over the past several years, we have been designing and developing our systems in preparation for getting them up “into the cloud”. Whether that meant Microsoft, Amazon, or someone else was unimportant, as the architecture needed to allow for high-availability and load-balanced deployments of our systems; the cloud-specific issues could be figured out later. About a year and a half ago, we deployed some minor systems to Azure and consumed some of its services (most importantly queueing and blob storage). Over the past month and a half, we’ve been making changes specific to Azure. And last weekend, a co-worker of mine (toward whom I can’t express enough gratitude) and I spent a grueling 72 hours, beginning Friday morning, migrating all of our databases and systems to Azure. We learned a lot through our various successes and failures during this migration, and in the time leading up to it.

For our system, we have a single set of internal WCF services hitting the database and half a dozen internal applications hitting those internal services. One of those internal applications is a set of externally-accessible WCF services, and on our customers’ behalf, we have some custom applications consuming those “public” services. Technologies/systems that we employ include the following:

  • SQL Server (50GB database where 23GB exists in a single table)
  • SQL Server Reporting Services
  • SQL Server Analysis Services
  • SQL Server Integration Services
  • WCF
  • .NET 4.0/MVC 2.0/Visual Studio 2010
  • Claims-Based Authentication (via ADFS)
  • Active Directory
  • Probably some more that I’m forgetting. If they’re important, I’ll add them back here.

By the end of the weekend, we had successfully migrated all of the critical systems we had planned to move to Azure, and only a couple of non-critical apps still needed migration. We (temporarily) pulled the plug on one of our non-critical applications, in part due to migration difficulties and in part due to a pre-existing bug that needs to be fixed ASAP, so we decided to tackle both at once the following week after getting some sleep. I can’t say the migration went without a hitch: while we had some unexpected major victories in some high-risk areas, we also had some unexpected major problems in some low-risk areas.

I’ll go over some specific experiences in follow-up posts, but here are the major points we took away from this experience that might help others. Some of these we knew about in advance and were prepared to deal with; others caught us by surprise and caused problems for our migration.

  1. If you have a large database (anything more than 5GB), do a LOT of testing before you start the migration! Backups, dropping/recreating indexes on large tables, etc.! For instance, we have one table that we can’t drop and recreate an index on, and the default ways to create backups take 8-10 hours for our database!
  2. When migrating your database to Azure, don’t make your local environment the “base system” for the migration. Upload a .bak backup file to Azure blob storage using a tool like Cerebrata’s Cloud Storage Studio (which uploads in small chunks so you can easily recover from errors and improve bandwidth utilization; the first sketch after this list shows the idea), create a medium-sized Azure Virtual Machine with a SQL Server Evaluation image, and base all of your data migration work from there. You’ll save so much time doing it this way unless you get everything working perfectly on your very first try (unlikely). For just a couple bucks (it literally cost us ~$2 for the entire weekend’s worth of VMs), it’s totally worth it!
  3. AUTOMATION!! Automation, automation, AUTOMATION! You will do the same thing over and over and over so many times; really, have a solid build server with automated build scripts for this! Do NOT use Visual Studio or any manual process! The investment in a build server will most likely pay for itself before your 5th deployment, regardless of how complex or simple your system is!
  4. No heaps! You must have a primary key/clustered index on every single table. No exceptions! Period! Exclamation mark! (A quick way to find offending tables is shown in the second sketch after this list.)
  5. Getting Data Sync up and running is a major pain in the ass! Azure’s Data Sync has stricter limitations than SQL Server’s (for instance, computed columns don’t play nicely at all in Azure, but SQL Server has no problem with them). There are just enough nuances, and they take long enough to find, that you can burn quite a bit of time just figuring this out. And then figuring out how to automate around these nuances is yet another topic of discussion, since the tools are so poor right now.
  6. Use the SQL Database Migration Wizard to migrate your data from your “base system” to an Azure database. But be gentle with it; it likes to crash, and that’s painful when it happens 3 hours into the process! Also, realize that it turns nullable booleans holding a NULL value into FALSE and doesn’t play nicely with some special characters, so be prepared to deal with these nuances!
  7. Red Gate SQL Compare and SQL Data Compare are GREAT tools to help you make sure your database is properly migrated! SQL Data Compare fixes up the problems from the SQL Database Migration Wizard very nicely and SQL Compare gives you reassurance that indexes, foreign keys, etc. are all migrated nicely.
  8. As I said before, test, test, test with your database! For us, 8-10 hour database backups were unacceptable. Our current solution is Red Gate’s Cloud Services Azure Backup service. With the non-transactionally-consistent backup option, we can get it to run in ~2 hours. Since we can have nightly maintenance windows, this works for us.
  9. Plan on migrating to MVC 3.0 if you want to run in Server 2012 instances.
  10. If you’re changing open endpoints in the Azure configuration (i.e. opening/closing holes in firewalls), you have to delete the entire deployment (not the service) and deploy again. Deploying over an existing deployment won’t work, but it also won’t give you any errors. Several hours were wasted here!
  11. MiniProfiler is pretty awesome! But the awesomeness stops and becomes very confusing if you have more than one instance of anything! Perhaps there’s a fix for this, but we haven’t found one yet.
  12. If you have more than just one production environment, it’s very handy to have different subscriptions to help you keep things organized! Use one subscription for Dev/QA/etc, one for Production, one for Demo, one for that really big customer who wants their own dedicated servers, etc. Your business folks will also appreciate this as it breaks billing up into those same groups. Money people like that.
  13. Extra Small instances are dirt cheap and can be quite handy! But don’t force things in there that won’t fit. We found that, with our SOA, Extra Small instances were sufficient for every role except two. For everything else, we actually get much better performance from 7 (or fewer) Extra Small instances than from 2 Small instances, and at a lower price (1 Small costs the same as 6 Extra Smalls).
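
To make point 2 concrete, here’s a minimal sketch of the chunked-upload idea in C#. It assumes the 2012-era Windows Azure Storage Client Library (the Microsoft.WindowsAzure.StorageClient namespace); the connection string, container name, and file paths are placeholders. Uploading the .bak as individual blocks means a failed chunk can be retried without restarting the whole transfer:

using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class BackupUploader
{
    static void Main()
    {
        // Placeholder credentials and paths; substitute your own.
        var account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey");
        var container = account.CreateCloudBlobClient().GetContainerReference("backups");
        container.CreateIfNotExist();

        CloudBlockBlob blob = container.GetBlockBlobReference("MyDatabase.bak");
        var blockIds = new List<string>();

        using (FileStream file = File.OpenRead(@"D:\Backups\MyDatabase.bak"))
        {
            const int blockSize = 4 * 1024 * 1024; // 4MB chunks are cheap to retry on failure
            var buffer = new byte[blockSize];
            int bytesRead, blockNumber = 0;
            while ((bytesRead = file.Read(buffer, 0, blockSize)) > 0)
            {
                // Block IDs must be base64 strings of equal length within a single blob.
                string blockId = Convert.ToBase64String(BitConverter.GetBytes(blockNumber++));
                using (var chunk = new MemoryStream(buffer, 0, bytesRead))
                    blob.PutBlock(blockId, chunk, null); // if this throws, re-upload just this block
                blockIds.Add(blockId);
            }
        }

        blob.PutBlockList(blockIds); // commit all blocks into the final blob
    }
}

Because each PutBlock call is independent, you can also retry or parallelize individual blocks; that’s essentially what a tool like Cloud Storage Studio does for you.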

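Point 4 is easy to audit before migration day. A heap is any table whose sys.indexes row has index_id = 0, so a quick sweep (a sketch; the connection string is a placeholder) looks like this:

using System;
using System.Data.SqlClient;

class HeapFinder
{
    static void Main()
    {
        // Placeholder connection string; point it at the database you plan to migrate.
        using (var conn = new SqlConnection(@"Server=.;Database=MyDatabase;Integrated Security=true"))
        {
            conn.Open();
            // index_id = 0 in sys.indexes means the table is a heap (no clustered index).
            var cmd = new SqlCommand(
                @"SELECT OBJECT_SCHEMA_NAME(i.object_id) + '.' + OBJECT_NAME(i.object_id)
                  FROM sys.indexes i
                  INNER JOIN sys.tables t ON t.object_id = i.object_id
                  WHERE i.index_id = 0", conn);
            using (SqlDataReader reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine("Heap (needs a clustered index): " + reader.GetString(0));
        }
    }
}

Every table this prints needs a clustered index (or a primary key backed by one) before SQL Azure will accept it.
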
In the next post, we’ll go over the things we did leading up to this migration to prepare for everything. From system architecture to avoiding SessionState like the plague to retry logic in our DAL (the general shape of which is sketched below), we’ll cover the things we did to help (or thought would help) make this an easier migration. And I will also highlight the things we didn’t do that I wish we had done!
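
Since retry logic in the DAL will come up in that post, here’s the general shape such a wrapper can take. This is a hedged sketch, not our actual implementation, and the error numbers are just a few commonly cited transient SQL Azure errors rather than an exhaustive list:

using System;
using System.Data.SqlClient;
using System.Threading;

public static class RetryHelper
{
    // A few commonly cited transient SQL Azure error numbers (throttling/failover); not exhaustive.
    private static readonly int[] TransientErrors = { 40197, 40501, 40613 };

    public static T Execute<T>(Func<T> operation, int maxAttempts = 3)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return operation();
            }
            catch (SqlException ex)
            {
                bool transient = Array.IndexOf(TransientErrors, ex.Number) >= 0;
                if (!transient || attempt >= maxAttempts)
                    throw; // permanent failure, or out of retries: let it bubble up
                // Back off a little longer before each successive retry.
                Thread.Sleep(TimeSpan.FromSeconds(attempt * 2));
            }
        }
    }
}

Usage is then something like RetryHelper.Execute(() => RunMyQuery());, wrapped around each database call in the DAL.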



Setting up an ASP.NET Service Account with Least-Privilege Permissions


I find myself rediscovering how to do this a lot, so I thought I’d post this here.

  1. Create your account (let’s say it’s MyDomain.com\sa_MyMVCHostingUser in your company’s Active Directory; it can also be a local Windows account)
  2. Open up an elevated command prompt (Start -> “command” -> [CTRL]+[SHIFT]+[ENTER])
  3. Navigate to your .NET Framework directory (such as C:\Windows\Microsoft.NET\Framework\v4.0.30319)
  4. Execute: aspnet_regiis -ga MyDomain.com\sa_MyMVCHostingUser

After performing the above steps, your account will have the basic permissions to host a basic ASP.NET application. If you are accessing resources other than the IIS Metabase or content files in your IIS Application, then those permission configurations are beyond the scope of this post (and you would NOT set them up in a similar way, so don’t try).



SOA-based Architecture in progress


I’ve been tasked with coming up with a standard architecture to apply to our future projects. Yeah, I know, there’s no blanket answer for everything, but given the requirements I expect, I think a single architecture can handle 95% of what we’ll be doing, and we can deviate/improve as necessary for the other 5%.

Ultimately, I don’t yet know what this is going to look like, but this is the direction I’m leaning in:
(more…)



Azure’s Service Bus and EnergyNet (PDC Day 0)


On Day 0 of Microsoft PDC, I attended the Software in the Energy Economy workshop. Much to my surprise (and disappointment), the entire first half of the workshop wasn’t about energy at all; instead, it was about Azure’s Service Bus. BAD Microsoft!! Juval Lowy explained that he wanted to spend the whole workshop on energy, but Microsoft required him to devote half of it to the Service Bus. I can understand this from Microsoft’s perspective, but it should have been made clear to attendees up front. Honestly, I would much rather have gone to another workshop than learned about the Azure Service Bus. Cool stuff, but my current analysis is that it’s way too unreliable (I don’t mean bugs, I mean lack of transactional support) and it’s simply missing some of the things I would want/need in such a system, like queues! There are hacks to implement them, but I don’t want such an important foundational part of my architectures built on hacks! Anyway, that can wait for another day (gotta get to the keynote).

(more…)



Rendering a Custom ASP.NET Control when Disabled by Parent Container


So I was performing some maintenance work on some WebForms stuff in an application and ran into a problem: an existing custom control (which I have the source for, so I can fix it; yay!) wasn’t properly disabling itself when the container it lived in became disabled. The control overrides the rendering process and spits out lots of HTML and JavaScript (eww!), but for the parts that matter here, it looks at a custom “ReadOnly” property on the control to enable/disable the appropriate pieces. So essentially the control is always enabled except when that flag is set to false, which is a bad idea!
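
For context, here’s a minimal sketch of the direction the fix takes, using a hypothetical WebForms control. WebControl already exposes a protected IsEnabled property that returns false when the control or any ancestor container is disabled, so rendering can honor that instead of relying solely on a standalone flag:

using System.Web.UI;
using System.Web.UI.WebControls;

// A hypothetical control illustrating the fix: base rendering decisions on IsEnabled,
// which accounts for disabled parent containers, rather than only a custom flag.
public class MyCustomControl : WebControl
{
    public bool ReadOnly { get; set; }

    protected override void Render(HtmlTextWriter writer)
    {
        // IsEnabled is false if this control OR any ancestor container is disabled.
        bool effectivelyDisabled = !IsEnabled || ReadOnly;

        writer.AddAttribute(HtmlTextWriterAttribute.Type, "text");
        if (effectivelyDisabled)
            writer.AddAttribute(HtmlTextWriterAttribute.Disabled, "disabled");
        writer.RenderBeginTag(HtmlTextWriterTag.Input);
        writer.RenderEndTag();
    }
}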

(more…)



Going to PDC 2009


Yesterday I got the thumbs-up that my company is sending me to PDC 2009! I planned on going whether I was sent or not, but it’s nice not having to foot the bill out of my own pocket this year. In 2005, I was lucky enough that my company covered airfare and hotel while Telerik was generous enough to cover my registration for the conference and the pre-conference. Last year I wasn’t so lucky and had to cover almost all of it out of pocket, but that’s what you expect as a contractor.

Here’s what I have to do:
(more…)



NAnt – Detect 64-bit or 32-bit OS


Cody Collins and I recently ran into a problem detecting from within NAnt whether the running OS was 32-bit or 64-bit. We’re trying to automate the installation of some software that has separate installers for 32-bit and 64-bit, and we need to determine which installer to run from NAnt.

The problem begins with NAnt being compiled for 32-bit mode only, which means the OS hides its 64-bitness from the process: if you ask the OS whether it’s 64-bit, it will tell you it isn’t. If it weren’t for that, we could simply depend on the PROCESSOR_ARCHITECTURE environment variable. Luckily, there is an IsWow64Process Win32 API call you can make to determine whether you are running in WoW64. From these two pieces of information, you can infer whether or not the OS is 64-bit.

Cody and I were able to come up with the following scripts to determine this.

Note: This runs unmanaged code and does not protect you from crashes there; this could be better, but it should get you 90% of the way there. This has been tested on Windows XP (32-bit), Windows 2003 (64-bit), Windows Vista (32-bit and 64-bit), Windows 2008 (32-bit), and Windows 7 RC (64-bit). Not an exhaustive test, but it covers many of the bases.

<property name="Is64BitOperatingSystem" value="false" />
<property name="Is64BitProcess" value="false" />
<property name="IsWow64Process" value="false" />
 
<target name="DetectOperatingSystemArchitecture" depends="DetectIfWow64Process,DetectIf64BitProcess">
	<description>
		This will detect whether the current Operating System is running as a 32-bit or 64-bit Operating System regardless of whether this is a 32-bit or 64-bit process.
	</description>
	<property name="Is64BitOperatingSystem" value="${IsWow64Process or Is64BitProcess}" />
 
	<choose>
		<when test="${Is64BitOperatingSystem}">
			<echo message="The operating system you are running is 64-bit." />
		</when>
		<otherwise>
			<echo message="The operating system you are running is 32-bit." />
		</otherwise>
	</choose>
</target>
 
<script language="C#" prefix="MyWin32Calls">
	<code>
		<![CDATA[
			[System.Runtime.InteropServices.DllImport("kernel32.dll")]
			public static extern bool IsWow64Process(System.IntPtr hProcess, out bool wow64Process);
 
			[Function("IsWow64Process")]
			public bool IsWow64Process()
			{
				bool retVal = false;
 
				// The native return value (did the call succeed?) is ignored here; retVal receives the answer.
				IsWow64Process(System.Diagnostics.Process.GetCurrentProcess().Handle, out retVal);
 
				return retVal;
			}
			]]>
	</code>
</script>
 
<target name="DetectIfWow64Process">
	<description>
		Detects whether we are currently in a WoW64 process or not.
	</description>
 
	<property name="IsWow64Process" value="${MyWin32Calls::IsWow64Process()}" />
	<echo message="Setting the [IsWow64Process] property to ${IsWow64Process}." />
</target>
 
<target name="DetectIf64BitProcess">
	<description>
		Detects whether we are currently in a 32-bit or 64-bit process (not necessarily what the OS is running). Note that as of the time of this writing, this will ALWAYS return false because NAnt is compiled to run in 32-bit mode only.
	</description>
 
	<!-- This can return x86, x64, AMD64, or IA64 as of the time of this writing. This works for a 32-bit process in a 64-bit OS because the OS makes the 64-bitness transparent to the process in this environment variable. -->
	<property name="Is64BitProcess" value="${environment::get-variable('PROCESSOR_ARCHITECTURE')!='x86'}" />
	<echo message="Setting the [Is64BitProcess] property to ${Is64BitProcess}." />
</target>

On a 64-bit OS, it has the following output:

D:\bin\deleteme\nanttest>build DetectOperatingSystemArchitecture
NAnt 0.85 (Build 0.85.2344.0; rc4; 6/2/2006)
Copyright (C) 2001-2006 Gerry Shaw
http://nant.sourceforge.net
 
Buildfile: file:///D:/bin/deleteme/nanttest/test.build
Target framework: Microsoft .NET Framework 2.0
Target(s) specified: DetectOperatingSystemArchitecture
 
   [script] Scanning assembly "lsbw4oxa" for extensions.
 
DetectIfWow64Process:
 
     [echo] Setting the [IsWow64Process] property to True.
 
DetectIf64BitProcess:
 
     [echo] Setting the [Is64BitProcess] property to False.
 
DetectOperatingSystemArchitecture:
 
     [echo] The operating system you are running is 64-bit.
 
BUILD SUCCEEDED
 
Total time: 0.2 seconds.
 
 
D:\bin\deleteme\nanttest>

Happy NAnting!!!

*heads off to the IndyALT.NET meeting on Continuous Integration now…*



IndyTechFest registration is now open!


IndyTechFest registration is now open! This year there is a limit of 500 registrations (I believe last year’s limit was around 400, and it was booked up within just a couple of weeks). So I strongly encourage you to register sooner rather than later!

There is a great lineup of speakers and sessions at this year’s IndyTechFest! Some of the speakers I have seen speak before include Paul Hacker, Larry Clarkin, Michael Eaton, Arie Jones, Tom Pizzato, Dan Rigsby, and Bill Steele. There are many other great speakers that I know or have heard of. This should be an excellent event and one that is worth a good long drive to get to!

Some of the sessions that I’m really looking forward to include Test Driven Development (TDD) w/ VS 2008, Tips and Tricks for the New C#, Tips and Tricks for the New VB .NET, Duplexing WCF in the Enterprise, and Virtualization of SQL Server. There are many other sessions that I hope to get to, but alas, with it being a one-day event, I doubt I’ll get to most of the ones I really want to see. 😛

Props to the people who worked hard to make this event possible, including Brad Jones, Dave Leininger, John Magnabosco, Mark McClellan, and Bob Walker, as well as all of the support of the local user groups to help drive the event!

Just as I was wrapping this post up, I received a phone call. Apparently, as of 1pm (1 hour after registration opened), nearly HALF of all available registration slots were filled! If you read this post and have not registered, go register NOW; don’t wait or you’ll be left out!



Indy Invades CinArc!!


On Tuesday evening (July 8, 2008), Sasha Kotlyar, Dean Weber, and I made a spontaneous trip to Cincinnati to check out the CinArc group (not to be confused with this CinARC). This group is Cincinnati’s architecture user group and seems to be mostly .NET-based. It’s a very new group; this was only their second meeting. They meet monthly on the second Tuesday of the month, and their current meeting format is a fishbowl. You can read more about fishbowl meetings on Wikipedia here.

I must say, the three of us Hoosiers really enjoyed ourselves at CinArc! Despite the downpours, rush-hour construction, and the construction barrels we had to dodge in the middle of the road, it was great! Oh, and I won a door prize as well! I walked out with a VS2008 Pro license (I only had MSDN-based licenses before; now I have a permanent license!). The group is led by Mike Wood, who also happens to lead the Cincinnati .NET User Group. Lots of other people were also in attendance (I’m not even going to attempt to name them because I’m horrible with names and would surely forget some of them, but it turns out I follow lots of them on Twitter). There were 19 people in total, with 5 chairs in the middle of the fishbowl (1 moderator, 3 speakers, 1 open). It was great that they veered away from the norm: the discussion was very interactive, and almost everybody participated.

The agenda for the meeting was different from what I’ve been used to at user group meetings, and I really liked it! I’m used to food beforehand, kicking things off with announcements, then going into the discussion for the rest of the night, with door prizes at the end. What they did instead was kick things off with the discussion, take a break for food about an hour in, open the second half with announcements, go back into the discussion after that nice little break, and then finish with door prizes. The trick to pulling this off is the timing of the food, and if it can be managed, I may actually try to assimilate this style into the ALT.NET meetings! However, one important part of the ALT.NET meetings, I feel, is the social time spent before the meeting. Perhaps we can have snacks and drinks available then and real food available at “halftime”.

One other thing that was really neat was that the meeting attendees chose the topics to discuss. Ideas were put up on a whiteboard as people suggested them, and then everybody voted on the ones they were most interested in discussing. Three topics seemed to be the most popular, and it turned out we had time to discuss exactly three. So it worked out perfectly!

As I said before, I had a great time at CinArc, and I highly recommend it to anybody in the area who is interested in best architecture practices and bouncing ideas off of one another! There were some extremely intelligent guys at this group, and it’s great that they are trying to expand knowledge in the community and help one another! I can already see this is going to be a very popular user group! I hope some of those guys come visit some of our Indy events, and I just may try to pick up some Reds tickets for a second-Tuesday afternoon so I have a good excuse to be in town for another CinArc meeting! 🙂

-Shane



Speed up .NET Compact Framework compile time by a LOT!


I was complaining on Twitter about how long it takes to compile a Compact Framework application, and Steve Schoon informed me of a hack you can do to DRASTICALLY speed up compilation! You can get the details here.

Read the entire thing so you are aware of what you’re disabling by performing this hack, but unless you are constantly flipping back and forth between various target platforms in your mobile development, you shouldn’t need this feature very often at all! It dropped our compile times from about 3 minutes to about 10 seconds. It was amazing!

-Shane
