When it comes to database migrations, application conversion and remediation, along with functional testing of the entire system, account for over 50% of the total effort. By using a proven CI/CD process, we’ve been able to reduce the overall migration time and create repeatable processes for ongoing changes after the migration. In this blog post, we’ll talk about how we helped a large multinational software company with ambitious growth plans migrate their Oracle Database and Java solution to Amazon Aurora PostgreSQL. Deploying a new release used to take 54 hours; with their new CI/CD process, it takes only 4.5 hours! Technologies we used included Apache Maven, TestNG, OpenJDK, AWS Schema Conversion Tool (SCT), Apache Groovy, Jenkins, Git, and Ansible playbooks.
Background on the customer’s original system
Our customer needed to modernize their Oracle Database by moving to PostgreSQL on AWS to gain flexibility and meet future requirements for scale. As part of this massive project, the DB Best team was asked to help upgrade their data-driven applications, deploy those applications and the converted database on AWS, and set up a DevOps process from scratch.
The customer’s original system had a relatively small on-premises Oracle Database with approximately 100 tables. Based on our prior engagement, we agreed that Amazon Aurora PostgreSQL would be the best migration target based on the following criteria:
- Reduced operating costs. Eliminating expensive Oracle Database 11g Enterprise Edition licenses by leveraging the AWS Database Freedom program.
- Reasonable migration costs. We used the AWS Schema Conversion Tool (SCT) and the AWS Workload Qualification Framework (WQF) to develop the migration plan.
Also, our customer maintained approximately 10 complex database applications built on aging Java versions (7 and 8). We needed to deploy these applications to Amazon EC2.
Our customer had a large development team supporting many products both on-premises and in the cloud, and they often lacked specifications for their legacy applications. Additionally, their team lacked adequate quality control, which is a must-have for such a complex project. Following best practices, they needed to support multiple environments, including development, integration, staging, and production.
The project requirements included:
- Build DevOps processes for multiple applications and databases.
- Develop CI/CD scripts for multiple environments.
- Create scripts to set up the AWS infrastructure: routing, load balancer, availability groups.
- Finally, make sure the deployment time fit within the 54-hour weekend downtime window.
The general approach to the project
First, our team obtained access rights to the customer’s Git and Jenkins. We leveraged Git for version control and Jenkins as the DevOps orchestration tool.
Our customer had an established deployment process for their on-premises production environment. However, we discovered it required manual steps and was not used for the development environment. Because the DB Best team needed to add support for the new cloud-based system, we created a new, fully automated CI/CD solution from scratch. This process automates 100% of the deployment activities.
Initially, our customer didn’t use source control for their Oracle Database code. They applied all schema and code updates directly in the Oracle Database. Keeping database code under source control is a DevOps best practice, so the DB Best team implemented the Liquibase database change management tool and stored the code in a Git repository. As a result, the entire PostgreSQL database schema can be created from scratch in a couple of clicks.
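To give a sense of how this works, here is a minimal sketch of a Liquibase XML changelog. The table and column names are hypothetical; the point is that each versioned changeSet lives in Git and Liquibase tracks which ones have already been applied.

```xml
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">

  <!-- Each changeSet is applied exactly once and recorded in the
       DATABASECHANGELOG tracking table -->
  <changeSet id="1" author="dbbest">
    <createTable tableName="customer_order">
      <column name="id" type="bigint" autoIncrement="true">
        <constraints primaryKey="true" nullable="false"/>
      </column>
      <column name="order_date" type="timestamp"
              defaultValueComputed="CURRENT_TIMESTAMP"/>
      <column name="status" type="varchar(32)"/>
    </createTable>
  </changeSet>
</databaseChangeLog>
```

Running `liquibase update` against an empty Amazon Aurora PostgreSQL instance then recreates the whole schema from the changelog history.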
Then, we converted the customer’s Oracle Database code to PostgreSQL using the AWS Schema Conversion Tool. We then deployed their new database to Amazon Aurora PostgreSQL. To complete the database migration, we needed to update their data-driven applications.
Remediation of data-driven applications
Usually, to meet this challenge, we perform the following actions:
- Extract the SQL code from the application
- Convert the extracted code to ensure compatibility with the new database platform
- Upload the converted code back into the application
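To illustrate the kind of rewriting involved in step two, here is a deliberately simplified Java sketch: a query extracted from the application that uses Oracle’s NVL, SYSDATE, and DUAL idioms is rewritten with their PostgreSQL equivalents. The query and the mapping table are hypothetical; real remediation (and AWS SCT) relies on proper SQL parsing rather than string replacement.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SqlRemediationSketch {
    // Tiny illustrative mapping of Oracle idioms to PostgreSQL equivalents.
    private static final Map<String, String> REWRITES = new LinkedHashMap<>();
    static {
        REWRITES.put("NVL(", "COALESCE(");          // NVL -> COALESCE
        REWRITES.put("SYSDATE", "CURRENT_TIMESTAMP"); // current date/time
        REWRITES.put(" FROM DUAL", "");             // PostgreSQL needs no dummy table
    }

    static String convert(String oracleSql) {
        String sql = oracleSql;
        for (Map.Entry<String, String> e : REWRITES.entrySet()) {
            sql = sql.replace(e.getKey(), e.getValue());
        }
        return sql.trim();
    }

    public static void main(String[] args) {
        String extracted = "SELECT NVL(status, 'NEW'), SYSDATE FROM orders";
        System.out.println(convert(extracted));
        // SELECT COALESCE(status, 'NEW'), CURRENT_TIMESTAMP FROM orders
    }
}
```

The converted statements are then uploaded back into the application’s data-access layer.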
The DB Best team utilized Apache Maven as the build automation tool for the customer’s Java web applications. Additionally, we implemented unit testing based on TestNG. This framework has proven to be the best fit for applications built on the OpenJDK technology stack.
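For readers unfamiliar with the wiring, TestNG is typically hooked into a Maven build through the Surefire plugin, roughly as in the pom.xml fragment below. The versions and the `testng.xml` suite file name are illustrative assumptions, not the customer’s actual configuration.

```xml
<!-- Illustrative pom.xml fragment: TestNG as the unit-test framework,
     executed by the Surefire plugin during `mvn test` -->
<dependencies>
  <dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>7.1.0</version>
    <scope>test</scope>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>2.22.2</version>
      <configuration>
        <suiteXmlFiles>
          <suiteXmlFile>testng.xml</suiteXmlFile>
        </suiteXmlFiles>
      </configuration>
    </plugin>
  </plugins>
</build>
```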
Developing the DevOps processes using CI/CD best practices
After completing the conversion steps of our proven 12-step database migration process, our team started building the DevOps processes. We created an automated process for deploying the customer’s databases and applications both on-premises and in the cloud. This process works the same way across all environments.
We also designed an important capability into the CI/CD scripts: automatic discovery of all currently running application instances. When the development team needs to upgrade these applications, they no longer need to manually specify all the IP addresses of the application instances. The script instantly discovers and updates all applications, while still allowing upgrades of only selected application instances if needed.
The following figure illustrates the general DevOps workflow that we created for their team.
In addition, we added code checks for style, commit messages, and so on. We also set up notifications for end users, sent via Slack and email. Now, let’s look at the technical approach in detail.
One of the most interesting parts of this diagram relates to the blue-green deployment technique. This approach minimizes deployment downtime. However, it is more expensive, as we deploy two production environments and keep them running simultaneously for a certain period.
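The workflow above can be sketched as a declarative Jenkins pipeline. The stage names, the Ansible playbook path, the inventory file, and the Slack channel here are hypothetical placeholders, not the customer’s actual scripts; the Slack step assumes the Jenkins Slack Notification plugin is installed.

```groovy
// Illustrative Jenkinsfile: build, unit-test, and deploy stages
// mirroring the DevOps workflow described above.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean package'   // Maven build of the Java application
            }
        }
        stage('Unit tests') {
            steps {
                sh 'mvn -B test'            // TestNG suites run via Surefire
            }
        }
        stage('Deploy') {
            steps {
                // Hypothetical playbook that rolls the build out to the
                // automatically discovered application instances
                sh 'ansible-playbook deploy/app.yml -i inventory/aws_ec2.yml'
            }
        }
    }
    post {
        always {
            // Hypothetical channel name
            slackSend channel: '#deployments',
                      message: "Build ${env.BUILD_NUMBER}: ${currentBuild.currentResult}"
        }
    }
}
```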
Blue-green deployment for minimal downtime
Our key challenge was to meet the customer’s main requirement: complete the migration to production within a 54-hour window. This meant that between Friday evening and the end of Sunday, all the automated deployment, testing, and potential rollback procedures needed to finish. So, to minimize deployment downtime, we utilized the blue-green deployment technique. Following industry CI/CD best practices, the DB Best team used Amazon Route 53 to ensure a seamless application upgrade.
This is how it works:
- Initially, the old application serves the entire load
- We set up the data replication to ensure that the new PostgreSQL database stores the same data as the source Oracle Database
- We use scripts to deploy the updated application to Amazon EC2 and connect it to the Amazon Aurora PostgreSQL database
- Then we switch the Amazon Route 53 to the new application
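The final DNS switch can be pictured as a single Route 53 record update, sketched below with the AWS CLI. The hosted zone ID, record name, and target DNS name are placeholders; a real cut-over would point at the customer’s load balancer and might use weighted records to shift traffic gradually.

```shell
# Illustrative Route 53 cut-over: repoint the application record
# at the new (green) environment. All identifiers are placeholders.
aws route53 change-resource-record-sets \
  --hosted-zone-id Z1EXAMPLE \
  --change-batch '{
    "Changes": [{
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "app.example.com",
        "Type": "CNAME",
        "TTL": 60,
        "ResourceRecords": [{"Value": "green-env.example-elb.amazonaws.com"}]
      }
    }]
  }'
```

A low TTL on the record keeps the window during which clients still resolve the old (blue) environment short.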
You can see that database replication is an integral part of meeting the minimal-downtime requirement. In fact, we set up two-way replication. First, we set up data replication from the old Oracle database to the new PostgreSQL database during the “go live” phase. Then, after the new database went live, we needed a temporary fallback. To provide it, we set up data replication from PostgreSQL back to the source Oracle Database, which ran in the background for several days. Only after the customer declared that they were satisfied with the performance of their new cloud-based system did we turn off the source Oracle Database.
Our release guide included no fewer than 60 steps, most of which were fully automated. The application and database deployments ran completely automatically, while we performed a few tasks of the release pipeline manually, such as stopping the Oracle Database.
The bottom line: we experienced only one minor issue, which we were able to fix on the fly. Other than that, the “go live” went well, and the cut-over took only four and a half hours – 12 times faster than prior deployments! Our customer was thrilled with the high quality of our delivery.
This proven CI/CD process helped streamline application and database deployment in the cloud. It improves the overall system’s resiliency, optimizes costs, and reduces infrastructure maintenance overhead. Our customer still uses this DB Best solution for maintenance tasks, as the process makes it easy to add new features to existing applications.
The DB Best team designed and delivered a fully automated process driven by Jenkins. We managed to exceed the customer’s expectations, as they could now reuse these DevOps scripts for further maintenance of their applications.
Contact us to streamline your database migrations and automate the routine data management processes.