Is this a valid, and possible, Azure QA Environment?
I am trying to formulate a valid dev/test/QA environment for my company's suite of applications as we migrate to Azure. However, I am enforcing the constraint that our dev/test/QA/etc. environments are actually hosted on-premises and deployed to via a build server (such as CC.NET, TeamCity, Jenkins, etc.).
In such a "Test Environment", we need the ability to trigger deployments of a specific snapshot of unreleased code (and data) for a team of QA and Business professionals to test for both technical testing and for business acceptance testing. Clearly all of these folks won't be compiling and hitting F5 in Visual Studio to do this testing, so we need an environment to deploy to. In our SDLC, we actually go through ~4 such environments before we get to开发者_如何学C staging and production. In short, we need a low-overhead (automated deployments) and easily-reproducible process for this.
In planning out this environment, the question of "How to host the Azure services" is clearly the hard part. So let's look at each part of Azure. The italicized options are the ones we are wanting to go with.
- Web Roles: Well, IIS can more-or-less handle these for us (at least enough for dev/test situations - everything but real load testing, which we would clearly have to do in Azure anyway, and that's fine).
- Worker Roles: We have two options here, it seems. The first is a "wrapper app" of sorts - a Windows Service that hosts the DLLs our Worker Roles call for their functionality (after all, our true Worker Role project shouldn't be anything more than a config file and ~4 lines of code that call a DLL to do the work; see the first sketch after this list). That option works, but requires some very different application code and deployment code to manage and maintain. The second option is to use the Azure Compute Emulator. This works as long as your Worker Roles do not need to monitor external ports or anything similar. In our case, our Worker Roles only need to monitor Queues and subsequently access various resources. The problem with this revolves around the differing build scripts, because the only way to automate a deployment to the Compute Emulator is to run CSPack and CSRun on the machine hosting the emulator, which probably isn't your build server. Because of that, you'll have to do some sort of remote scripting to accomplish it.
- VM Roles: We don't really care about these, so I'm totally ignoring this aspect of testing.
- Queues: Here, we have 3 options. The first is to use MSMQ. Because this requires an entirely different codebase that we don't have (or at least an abstraction around that different codebase), I'm not considering this option. The second is to keep the Queues in Azure, since they're so small/cheap. We're actually doing this temporarily until we can try the third option: the Azure Storage Emulator. I'm not sure, but I believe the emulator will only allow services running on the local machine to access the storage objects. For Queues, our application code is what actually "deploys" (creates) the queues, so that should be fine as long as our application code is actually running on the server hosting the Azure Storage Emulator (the second sketch after this list shows how the same code can target either the emulator or real storage via the connection string).
- Tables: Here, we have 3 options. The first, which I hate, is to use a relational database and create a table in it to stand in for these tables; I'm not considering that option. The second is to keep the tables in Azure. I don't like this because that's a lot of back-and-forth for things that may store data of significant size (up to 1 MB per record). While Queues are incredibly lightweight and cheap, table costs can add up pretty quickly. That leaves the third option, the Azure Storage Emulator. Again, I'm not sure, but I believe the emulator will only allow services running on the local machine to access the storage objects. I still don't fully understand the pros/cons of tables in the emulator.
- Blobs: Here, we really have only 2 options. The first is a bad one: keeping them in Azure. These are most likely files of significant size, so that's unwise. So the second option, once again, is the Azure Storage Emulator, and I think that's what we need to do.
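For what it's worth, here is a minimal sketch of the "thin" Worker Role plus the on-premise Windows Service wrapper described above. It assumes all of the real work lives in a shared class library; the `QueueProcessor` type and its `ProcessNextMessage` method are hypothetical names, not anything from our actual codebase:

```csharp
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

// Assumed shared class library type; the real one would pull a queue message and act on it.
public class QueueProcessor
{
    public void ProcessNextMessage() { /* read a queue message, do the work */ }
}

// The "true" Worker Role: little more than a loop that delegates to the shared DLL.
public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        ServicePointManager.DefaultConnectionLimit = 12;
        return base.OnStart();
    }

    public override void Run()
    {
        var processor = new QueueProcessor();
        while (true)
        {
            processor.ProcessNextMessage();
            Thread.Sleep(1000);
        }
    }
}

// The on-premise wrapper: a Windows Service hosting the same shared class.
public class QueueProcessorService : System.ServiceProcess.ServiceBase
{
    private Thread _worker;

    protected override void OnStart(string[] args)
    {
        _worker = new Thread(() =>
        {
            var processor = new QueueProcessor();
            while (true)
            {
                processor.ProcessNextMessage();
                Thread.Sleep(1000);
            }
        });
        _worker.IsBackground = true;
        _worker.Start();
    }

    protected override void OnStop()
    {
        // Real code would signal the loop to exit cleanly instead of relying on the background thread dying.
    }
}
```

The point of the split is that the wrapper-app option only duplicates the hosting shell, not the business logic, which is what makes the "very different deployment code" trade-off even worth considering.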
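And here is a rough sketch of how the same application code can point at either the Azure Storage Emulator (for the on-premise QA environments) or real Azure storage simply by swapping the connection string. It assumes the 1.x-era Microsoft.WindowsAzure.StorageClient library, and the queue, table, and container names are made up for illustration:

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class StorageBootstrapper
{
    public static void EnsureStorage(string connectionString)
    {
        // "UseDevelopmentStorage=true" targets the Storage Emulator on the local machine;
        // a real account connection string targets Azure. Pull this from config per environment.
        var account = CloudStorageAccount.Parse(connectionString);

        // Queues: the application code "deploys" (creates) them at start-up.
        var queueClient = account.CreateCloudQueueClient();
        var workQueue = queueClient.GetQueueReference("work-items");        // hypothetical name
        workQueue.CreateIfNotExist();

        // Tables: same pattern.
        var tableClient = account.CreateCloudTableClient();
        tableClient.CreateTableIfNotExist("QaResults");                     // hypothetical name

        // Blobs: same pattern.
        var blobClient = account.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference("qa-artifacts");   // hypothetical name
        container.CreateIfNotExist();
    }
}

// Example usage at service start-up:
//   StorageBootstrapper.EnsureStorage("UseDevelopmentStorage=true");  // on-premise QA box running the emulator
//   StorageBootstrapper.EnsureStorage("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...");  // real Azure
```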
So, given that we have MVC apps (web role), WCF web services (web role), Queues, Tables, Blobs, and Worker Roles that are triggered by queues but access tables, blobs, and the WCF web services, does this seem like a reasonable way to host our internal QA (and similar) environments? And, other than some annoyances with remote-scripting CSPack and CSRun to deploy to the Azure emulator, does this all sound reasonably automatable with a build server?
IMHO: You're jumping through too many hoops to avoid deploying to Azure for your Dev & QA environments. Why not just deploy, and test out your deployment scripts at the same time? Use extra-small instances to keep costs low.
Emulation of storage is not /that/ great at all. There are many minor differences that will make your testing unreliable. Nor are you testing load balancing - something that would surface problems with any unplanned-for session state.
One of the most useful pieces of advice I got about Azure is "Test on Azure as soon as possible." It will help you address the differences between real Azure and the emulator as early as possible in your development life cycle, and there are many.
Secondly, your alternative solutions sound like a lot of work just to have testing environments. I think hosting on Azure would be more cost-effective, and in the end you will still have to test on Azure before releasing your product.