Tags: Active Geo-Replication, data-tier application, Transparent Data Encryption
One of the leading medical software providers maintained a complex healthcare system with thousands of unique users. As it evolved over time, the system became overloaded and insecure, so the customer approached us to update it to meet HIPAA standards. After thorough research, we decided to migrate their databases from SQL Server 2008 to Azure SQL Database.
As a result, the customer benefited tremendously from modern high availability and disaster recovery capabilities, along with better visibility into and control over their costs.
Watch the following video to see how we migrated the customer's databases to the Azure environment.
Original customer problem
Our customer produces and maintains clinical guidelines and other medical software used by more than 2,000 clients. Each client used their own copy of the application (including older versions), connected to a corresponding database, so our customer had to support a large number of databases.
Initially, the customer wanted to upgrade their SQL Server 2008 environment to SQL Server 2014 in order to align with Microsoft’s SQL Server supportability roadmap and gain the ability to use Always On Availability Groups, Active Geo-Replication, and Transparent Data Encryption.
A comprehensive analysis revealed several issues: a migration to SQL Server 2014 would run into limitations related to the number of databases and Availability Groups. We therefore concluded that migrating the customer’s on-premises databases from SQL Server 2008 to the Azure SQL Database cloud platform was the better option.
It turned out that the user databases contained procedures, functions, assemblies, and views that could not be imported directly into Azure SQL Database. To overcome this incompatibility, we made all non-breaking code changes in a pre-cleanup phase and performed the breaking changes on the fly during the migration process.
During the database migration process we had to meet the following customer requirements:
- Read-only access on a secondary replica, equivalent to an Always On Availability Group secondary;
- A single logical server per environment;
- Logical server access only from the customer’s environment;
- The “Allow access to Azure services” firewall setting disabled;
- Rapid deployment.
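The two firewall-related requirements above can be met with server-level firewall rules on the Azure SQL logical server. The sketch below assumes a hypothetical rule name and a placeholder IP range; the statements run in the logical server’s master database:

```sql
-- Allow connections only from the customer's environment
-- (rule name and IP range are placeholders)
EXECUTE sp_set_firewall_rule
    @name = N'CustomerEnvironment',
    @start_ip_address = '203.0.113.0',
    @end_ip_address = '203.0.113.255';

-- Remove the built-in rule that grants access to Azure services
EXECUTE sp_delete_firewall_rule @name = N'AllowAllWindowsAzureIps';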
We decided to use a data-tier application (DAC) to encapsulate each database’s schema and data during the migration. We used the DACFx framework to package the source SQL Server objects into DacPac files, employing the SqlPackage.exe command-line utility for this operation.
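For illustration, extracting a database’s schema together with its data into a DacPac might look like the following (the server name, database name, and output path are placeholders, not the customer’s actual values):

```cmd
SqlPackage.exe /Action:Extract ^
    /SourceServerName:SQL2008SRV ^
    /SourceDatabaseName:ClinicalDB ^
    /TargetFile:C:\dacpacs\ClinicalDB.dacpac ^
    /p:ExtractAllTableData=true
```

The `/p:ExtractAllTableData=true` property instructs SqlPackage.exe to include table data in the DacPac alongside the schema.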
We ran all migration processes from the customer’s internal environments and did not use any Azure migration services. We also implemented automatic error recovery.
Let’s take a look at high-level migration steps:
- At the preliminary stage, we cleaned the SQL Server 2008 source code of invalid objects and fixed the errors that were preventing SqlPackage.exe from creating the initial DacPac files;
- Then we extracted the schema and data of the on-premises SQL Server 2008 databases into DacPac files, which are designed to carry a copy of the original database through the migration process. At this step, the DACFx framework reported false-negative errors, which we suppressed in order to complete the export;
- Next, we adjusted the DacPac files to make the database schemas Azure-ready. To reach this goal, we rewrote CLR functions and procedures as T-SQL equivalents and removed the CLR assemblies. We converted login-based users to contained database users, since contained databases have no external dependencies and can therefore be moved easily. We finished this stage by dropping SQL users based on Windows users and revoking the guest user’s permission to access the database;
- Then we used the finalized DacPac files to create the new databases on the Azure cloud platform. To automate the migration of a large number of databases, we created a PowerShell script that migrated 20 databases simultaneously;
- Once the schema and data migration was finished, we moved to the testing stage. Using the Database Compare Suite tool, we validated that no unexpected schema changes had been introduced and that no data values had changed.
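The user-remapping part of the schema-adjustment step above can be sketched in T-SQL. All user names and the password below are hypothetical placeholders:

```sql
-- Replace a login-based user with a contained database user
-- (contained users authenticate at the database, not the server)
DROP USER [legacy_login_user];
CREATE USER [app_user] WITH PASSWORD = 'placeholder-Str0ng-password';

-- Drop a SQL user based on a Windows user
DROP USER [CONTOSO\clinician];

-- Revoke the guest user's permission to access the database
REVOKE CONNECT FROM GUEST;
```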
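The parallel deployment step can be sketched in PowerShell using background jobs, throttled so that at most 20 SqlPackage.exe publishes run at once. This is a simplified sketch, not the script we shipped; the folder, server name, and credentials are placeholders:

```powershell
$dacpacs = Get-ChildItem -Path 'C:\dacpacs' -Filter '*.dacpac'
$maxParallel = 20

foreach ($dacpac in $dacpacs) {
    # Throttle: wait until fewer than 20 publish jobs are running
    while ((Get-Job -State Running).Count -ge $maxParallel) {
        Start-Sleep -Seconds 5
    }
    Start-Job -ArgumentList $dacpac.FullName, $dacpac.BaseName -ScriptBlock {
        param($file, $dbName)
        # Publish one DacPac as a new Azure SQL database
        & 'SqlPackage.exe' /Action:Publish `
            /SourceFile:$file `
            /TargetServerName:'customer-server.database.windows.net' `
            /TargetDatabaseName:$dbName `
            /TargetUser:'migration_admin' `
            /TargetPassword:$env:SQL_PASSWORD
    } | Out-Null
}
# Wait for the remaining jobs and collect their output
Get-Job | Wait-Job | Receive-Job
```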
Conclusions and benefits
To complete the project, we adjusted the Azure configuration to ensure full integration with the migrated code: we configured Active Geo-Replication for disaster recovery and Transparent Data Encryption to protect data at rest from malicious attacks. Finally, we reconnected the users’ applications to the new Azure databases.
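Both features can be enabled per database in T-SQL; the database and secondary server names below are placeholders:

```sql
-- Create a readable geo-secondary for disaster recovery
-- (run in the master database of the primary logical server)
ALTER DATABASE [ClinicalDB]
    ADD SECONDARY ON SERVER [customer-dr-server]
    WITH (ALLOW_CONNECTIONS = ALL);

-- Enable Transparent Data Encryption for the database
ALTER DATABASE [ClinicalDB] SET ENCRYPTION ON;
```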
Summing up the benefits: the migration gave our customer an up-to-date, stable platform, allowed them to decommission aging hardware, and improved their hosting operations.
Feel free to contact us to perform your cloud migration project.