Moving Databases Between SQL Azure Servers

This week I had to move a couple of databases between two SQL Azure accounts. I needed to decommission one account and thus get the important stuff out before nuking it. My goal was straightforward: move two databases from one SQL Azure server (and account) to another SQL Azure server (and account). The applications have a fairly high tolerance for downtime, which meant I didn’t have to concern myself with maintaining continuity.

For the schema I had two options to choose from: script the entire database (using Management Studio) or extract the database as a .dacpac. For the data I also had two options: include the data as part of the script or use the Import/Export Wizard to copy the data. As a side note, I always thought this tool was named backwards – shouldn’t it be the Export/Import tool?

I opted to go the .dacpac route for two simple reasons: first, I wanted to move the schema over and validate it before moving the data; second, I wanted the schema in a portable format that wasn’t easily altered. Think of this as a way of preserving the integrity of the schema. Transact-SQL scripts are too easy to change without warning.

I connected Management Studio to the server being decommissioned and created a .dacpac from each database – I did this by right-clicking the database in Object Explorer and selecting Tasks –> Extract Data-tier Application… I followed the prompts in the wizard, accepting all of the defaults. Neither of my schemas is very complex, so the process was extremely fast for both.

Once I had the two .dacpacs on my local machine I connected Management Studio to the new server. I expanded the Management node, right-clicked on Data-tier Applications and selected Deploy Data-tier Application. This launched the deployment wizard. I followed the wizard, accepting the defaults. I repeated this for the second database.
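
If you want a quick sanity check that the deployments actually registered, the DAC metadata can be queried directly. A small sketch – I’m assuming the dbo.sysdac_instances view, which SQL Azure keeps in the master database:
-- Run in master on the new SQL Azure server
-- Lists the registered Data-tier Applications and their versions
SELECT instance_name, type_name, type_version
FROM dbo.sysdac_instances;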

Now that I had the schema moved over I had confidence everything transferred correctly, since the Data-tier Application deployments completed without any errors. It was time to move the data.

I opted to use the Import/Export Wizard for this. It was simple and straightforward. I launched the wizard, pointed to the original database, pointed to the new database, selected my tables and let ‘er rip! It was fast (neither database is very big) and painless. One thing to keep in mind when doing this is that it’s not performing a server-to-server copy; it brings the data locally and then pushes it back up to the cloud.
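
If you want extra assurance beyond the wizard’s success messages, comparing row counts on both sides is a cheap check. A sketch using only the standard catalog views – run it against both the source and destination databases and compare the results:
-- Approximate row counts per table (heap or clustered index)
SELECT t.name AS table_name, SUM(p.rows) AS row_count
FROM sys.tables t
JOIN sys.partitions p ON p.object_id = t.object_id AND p.index_id IN (0, 1)
GROUP BY t.name
ORDER BY t.name;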

The final step was to re-enable the logins. For security reasons passwords are not stored in the .dacpac. When SQL logins are created during deployment, each one is created as a disabled login with a strong random password and the MUST_CHANGE clause. Thereafter, you must enable and change the password for each new SQL login (using the ALTER LOGIN command, for example). A quick bit of Transact-SQL and my logins were back in business.
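
For the curious, that quick bit of Transact-SQL looked roughly like this – the login name and password here are placeholders, not my real ones:
-- Run in the master database on the new server
-- Replace the random MUST_CHANGE password, then enable the login
ALTER LOGIN [app_login] WITH PASSWORD = 'Str0ng!NewPassw0rd';
ALTER LOGIN [app_login] ENABLE;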

The entire process took me about 15 minutes to complete (remember my databases are relatively small) – it was awesome!

Every time I use SQL Azure I walk away with a smile on my face. It’s extremely cool that I have all this capability at my fingertips and I don’t have to worry about managing a service, applying patches, etc.

10/13/2010: Correction: There is a systematic way to move databases between servers. This blog post explains how to create a new database as a copy of another database.

The Transact-SQL is as follows:
CREATE DATABASE destination_database_name
AS COPY OF [source_server_name.]source_database_name

Keep in mind that your username and password must be the same on the target and source servers, you must have CREATE DATABASE permission on the target server, and you must have permissions on the source database. Since the DAC registration metadata resides in master, it won’t be copied over. After the database copy you can connect to the new database and register it as a Data-tier Application.
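
Note that the copy runs asynchronously – the CREATE DATABASE statement returns before the data has finished copying. You can watch the progress from master on the target server; a sketch, assuming the sys.dm_database_copies view that SQL Azure provides:
-- Run in master on the target server while the copy is in flight
SELECT database_id, start_date, percent_complete
FROM sys.dm_database_copies;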