When Written: April 2011
In the previous article I showed how to get started with Microsoft’s Azure and create your first web application in the Cloud. That application was little more than a basic web site, and to achieve anything useful most applications will soon need access to data held in a database, so I thought it would be useful to cover setting up a database in the Cloud next. I was quite surprised to find just how much did not work and how many limitations there are to using Azure as your database platform. Whilst some of these limitations have workarounds and others would probably have only a minor impact on your project, you still need to check what the implications might be for your application. Two major omissions at the time of writing are the lack of full-text indexing and the absence of SQL Agent. Microsoft also suggests you upgrade your local database to SQL Server 2008 and then migrate to Azure, but I decided instead to take a legacy database that was originally developed in SQL Server 2000 and currently runs on SQL Server 2005.
The first stage is to log into the SQL Azure management screen on the web at https://windows.azure.com/default.aspx. Here you can create a database and choose whether it will be a 1 GB or a 5 GB database; the difference, apart from the obvious amount of storage, is the cost to you: 1 GB is currently $9.99 a month whilst 5 GB is $49.99 a month, plus data transfer charges. After setting the usual admin account login details, an important step is to create the firewall rules. This is the firewall into the Microsoft Cloud, and you need to tell it to allow your external IP address in, otherwise many of your desktop SQL management tools will not work. With your database created, the next stage is to build the tables. You can do this manually via the Silverlight-based admin screen provided on the management site, but that is a tedious process, and you have probably already prototyped your database locally in SQL Server on your own machine, so building your Azure database tables via scripts is the sensible way to go; rather than hand-crafting those scripts, it is easier to generate them with the desktop SQL Server Management Studio.
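Incidentally, as well as the web screen, SQL Azure lets you manage firewall rules in T-SQL against the server’s master database. A minimal sketch, assuming a rule name and IP address of my own invention:

```sql
-- Run against the master database of your SQL Azure server.
-- 'AllowMyOffice' and the address 203.0.113.10 are example values:
-- substitute your own rule name and external IP address.
EXEC sp_set_firewall_rule N'AllowMyOffice', '203.0.113.10', '203.0.113.10';

-- To list the rules currently in force:
SELECT name, start_ip_address, end_ip_address FROM sys.firewall_rules;
```

This is handy once your own IP address is allowed in and you want to script further rules rather than click through the portal.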
Issues with indexes
Before you do this, however, make sure of two very important things. One is that your tables all have primary key fields; the other is that every table has a clustered index, which SQL Azure requires. This requirement is a good one to enforce from a performance standpoint, but you may well find some of your legacy databases need modifying; it is a simple job, and the new index should not affect your application. I ran some generated SQL code against Azure and found that it would not run without manual tweaking. For example, before every CREATE VIEW or CREATE PROCEDURE you will need a GO command to ‘fool’ the database into treating that section of the script as the beginning of a batch. Another feature of index creation that is not supported in Azure is FILLFACTOR, which is used to optimise the index page fill for performance, so you will have to remove any reference to it. You also cannot reference another database within your query, so three-part syntax such as ‘anotherdb.dbo.tablename’ is not allowed. This last limitation could force a major rewrite of your application; it certainly would in the application I am using in these articles.
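To make those tweaks concrete, here is a sketch of a script fragment that SQL Azure will accept, using an example table of my own; it shows the clustered index requirement, the GO before CREATE VIEW, and a FILLFACTOR clause stripped out:

```sql
-- SQL Azure requires a clustered index on every table; a clustered
-- primary key satisfies this.
CREATE TABLE dbo.Customers (
    CustomerID int          NOT NULL,
    Name       nvarchar(100) NOT NULL,
    CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED (CustomerID)
);
GO

-- CREATE VIEW must be the first statement in its batch,
-- hence the GO immediately above.
CREATE VIEW dbo.CustomerNames AS
    SELECT CustomerID, Name FROM dbo.Customers;
GO

-- FILLFACTOR is not supported, so this locally generated form:
--   CREATE INDEX IX_Name ON dbo.Customers (Name) WITH (FILLFACTOR = 80);
-- has to be reduced to:
CREATE INDEX IX_Name ON dbo.Customers (Name);
```

The table and index names here are illustrative only; the point is the shape of the script your locally generated SQL has to be massaged into.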
To connect to your Azure database there are a couple of tricks to keep in mind. When you first create the database a server name will be automatically generated; note this name, as you will obviously need it to connect later. It will be something like “r7ht56sjju7.database.windows.net”, and you will have set an admin name and password when you created the database. Assuming you chose “Azureadmin” as this user, then to connect to your Azure database, whether from your code or from any of the range of tools available, from Visual Studio down to basic script runners, you will need to log in as “Azureadmin@r7ht56sjju7”. This syntax caused me a little hiatus at first.

Having battled with generating scripts and manually hacking them to create the database schema, I now needed to get some data into it, as I certainly wasn’t going to enter test data manually via the Silverlight administration screen, so I downloaded the developer edition of SQL Server 2008 R2 to have access to the latest tools, in particular the SQL Server Import and Export Wizard. This tool turned out to be a big disappointment. To connect to Azure you need to use the .NET Framework Data Provider for SQL Server rather than the more usual SQL Native Client, but even after I had discovered this and fixed all the issues with the database indexes, as soon as I tried to transfer more than a small amount of data the wizard would time out, and increasing the timeout parameter made little difference. A little more searching revealed an upload utility at http://sqlazuremw.codeplex.com called the SQL Azure Migration Wizard which, in the words of that varnish advert, “does exactly what it says on the tin”. All the business of creating correctly formatted scripts and transferring the data is done for you, and it works perfectly, so with this tool you can have a copy of your local database, complete with all its data, hosted out in the Cloud within minutes.
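For reference, a typical ADO.NET-style connection string for the example server above would look something like the following sketch; the database name and password are placeholders of my own, and note the “user@server” form of the login:

```
Server=tcp:r7ht56sjju7.database.windows.net,1433;
Database=MyDatabase;
User ID=Azureadmin@r7ht56sjju7;
Password=<your password>;
Encrypt=True;
```

(The whole thing goes on one line in practice; I have broken it up here for readability.) The same “Azureadmin@r7ht56sjju7” login is what you type into the connection dialogs of Management Studio and the various wizards.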
One little ‘gotcha’ (there always seems to be one, doesn’t there?) is that you need to click on the ‘Advanced’ button and choose to generate DROP as well as CREATE scripts, to make sure that any old tables are removed before the new ones are created; otherwise the import can fail, for instance if your existing tables don’t have a clustered index. The generated scripts can be saved for examination later, so you can see exactly what the tool decided was best, just in case its changes impact your application later on. It is a shame that Microsoft can’t write a utility like this and that we have to rely on the community to come up with something that works.
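The DROP-then-CREATE pattern the wizard generates is roughly of this shape (again using my example table, not the tool’s literal output):

```sql
-- Remove the old copy of the table, if one exists, before recreating it.
IF OBJECT_ID(N'dbo.Customers', N'U') IS NOT NULL
    DROP TABLE dbo.Customers;
GO

CREATE TABLE dbo.Customers (
    CustomerID int          NOT NULL,
    Name       nvarchar(100) NOT NULL,
    CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED (CustomerID)
);
GO
```

Saving the generated scripts lets you check exactly which DROPs and index changes the tool decided on before you let it loose on a database you care about.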
While we are on the subject of lame Microsoft tools for Azure, I was simply stunned at SQL Server Management Studio’s support for Azure. Sure, you can connect to your Azure database with it just as you would any other SQL Server database on your network, but you can neither modify the design of tables nor examine the data in them except by writing queries, rendering this tool about as useful for Azure as a chocolate teapot! With a collection of tables, views and stored procedures generated and populated with data, our database is ready to use, and a couple of simple SELECT queries revealed that all seemed to be well.
Article by: Mark Newton