Blog: Adding a test automation framework into a database migration CI/CD pipeline

In our previous blog post, Creating a CI/CD solution for the database unification project, we covered the complexities of maintaining a single application code base that supports both Oracle Database and Amazon Aurora PostgreSQL. We briefly mentioned our approach of extending our client’s Java code base with Apache Groovy so that it works with both database platforms.

In this blog post, we’ll show how we built a solid test automation framework as part of our database migration project running on AWS using TestNG, Apache Maven, Selenium WebDriver, Jenkins, and Amazon EC2. As of the publishing date of this blog post, our team has increased testing code coverage from 10% to 70% using our automated test framework. More importantly, we reduced the regression testing cycle time from 5 days to 5 hours! We’ll cover two specific challenges that we encounter in many applications.

  • Developing test cases for web user interfaces
  • Automation of batch jobs

Understanding the current testing process

Until this point, our client’s testing relied on a combination of written demo scripts, end-user documentation, and software support clarification reports.

By analyzing the user interface along with the application’s Java and Groovy code base, our team created manual test cases to develop patterns for use in automating the tests. We then developed a method for generating test scripts based on the code that we validated with our client.

Developing a test automation framework for web UI tests

Let’s go over the automation framework we developed for web user interface test cases within the AWS cloud.

test automation framework — general diagram

Our web UI test automation framework leveraged TestNG, Apache Maven, Selenium WebDriver, and Java. Up to this point, our client had limited UI testing in place, so our team needed to design and implement test cases to improve overall code coverage. These scripts were then added to the CI/CD pipeline, with Jenkins acting as the orchestrator that compiles the build and executes the test scripts. At the end of the CI/CD pipeline, the automation framework took the recorded results and generated reports for the development team using a customized version of our DB Best Test Automation Portal.

In the event of a failed run, our framework notifies the QA engineer with details about the failed test cases. If the code passes the tests against Oracle, the CI/CD pipeline then converts and deploys it to the PostgreSQL databases, where we run another stage of code validation.

Modifying Apache Groovy HTML applications from Oracle to PostgreSQL

The Apache Groovy groovy-sql module provides a JDBC abstraction layer for accessing databases like Oracle and PostgreSQL. However, there are enough differences in the way Oracle and PostgreSQL implement the JDBC interface that you can never assume you can simply change the connection string and everything will magically work.

With over 700 existing cloud and SaaS deployments, it was either automate the tests or suffer the consequences of manual testing. By using the AWS Schema Conversion Tool, our team was able to migrate the specific SQL queries from Oracle PL/SQL to PostgreSQL. We then programmatically converted the JDBC calls from Oracle to PostgreSQL based on the different execution patterns used in Groovy.
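To give a flavor of what programmatic conversion looks like, here is a minimal sketch of a rule-based SQL dialect rewriter. The rules shown (NVL, SYSDATE, DUAL, sequence syntax) are common Oracle-to-PostgreSQL differences chosen for illustration; they are not the project’s actual conversion table.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of rule-based Oracle-to-PostgreSQL SQL rewriting.
// The rules below are examples only, not the project's full conversion set.
public class SqlDialectRewriter {

    private static final Map<String, String> RULES = new LinkedHashMap<>();
    static {
        RULES.put("NVL(", "COALESCE(");            // Oracle NVL -> ANSI COALESCE
        RULES.put("SYSDATE", "CURRENT_TIMESTAMP"); // Oracle clock function
        RULES.put("FROM DUAL", "");                // PostgreSQL needs no DUAL table
    }

    public static String toPostgres(String oracleSql) {
        String sql = oracleSql;
        for (Map.Entry<String, String> rule : RULES.entrySet()) {
            sql = sql.replace(rule.getKey(), rule.getValue());
        }
        // Oracle "seq.NEXTVAL" becomes PostgreSQL "nextval('seq')"
        sql = sql.replaceAll("(\\w+)\\.NEXTVAL", "nextval('$1')");
        return sql.trim();
    }

    public static void main(String[] args) {
        System.out.println(toPostgres("SELECT NVL(name, '?'), SYSDATE FROM DUAL"));
        System.out.println(toPostgres("SELECT orders_seq.NEXTVAL"));
    }
}
```

In practice, simple string rules only cover the mechanical cases; statements that rely on Oracle-specific semantics still need the AWS Schema Conversion Tool plus manual review.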

It was then a matter of generating test cases for the Web UI and incorporating them into our test framework.

Building a batch-job test automation framework

Let’s now take a look at how we incorporated database batch jobs into our test automation framework.

incorporating database batch jobs into the test automation framework

Our batch-job test automation framework, based on Java, TestNG, and Apache Maven, runs smoke tests first.

The framework consists of two testing layers: high-level tests, and a page-object layer that initializes all page elements and methods. We also use helper classes for content checks, element checks, connections to the Oracle and PostgreSQL databases, and so on.
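The two-layer split can be sketched as follows. To keep the example self-contained we use a tiny stand-in for the browser; in the real framework that role is played by Selenium’s WebDriver, and the locator and class names here are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the two-layer design: a page object owns all element lookups,
// while the high-level test only calls page-object methods.
public class PageObjectSketch {

    // Stand-in for a browser session (real code uses Selenium's WebDriver).
    static class FakeDriver {
        final Map<String, String> fields = new HashMap<>();
        void type(String locator, String value) { fields.put(locator, value); }
        String read(String locator) { return fields.getOrDefault(locator, ""); }
    }

    // Page-object layer: locators and element interactions live only here.
    static class LoginPage {
        private static final String USER_FIELD = "#username"; // hypothetical locator
        private final FakeDriver driver;

        LoginPage(FakeDriver driver) { this.driver = driver; }

        LoginPage enterUsername(String name) {
            driver.type(USER_FIELD, name);
            return this; // fluent style keeps high-level tests readable
        }

        String shownUsername() { return driver.read(USER_FIELD); }
    }

    public static void main(String[] args) {
        // High-level test layer: no locators, only page-object methods.
        LoginPage page = new LoginPage(new FakeDriver());
        page.enterUsername("qa_user");
        System.out.println(page.shownUsername());
    }
}
```

The payoff of this layering is that when the UI markup changes, only the page object is touched; the high-level tests stay stable.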

Only after reviewing the smoke-test results could we decide whether to proceed with further testing stages. To run the actual list of jobs, we used the Google Sheets API to fetch the filtered jobs from our document. We created classes for each product and used TestNG’s DataProvider feature, which allows running the same test method multiple times with different data sets.
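The data-driven pattern looks roughly like this. In the real framework the rows come from the Google Sheets API and the test method carries a TestNG @Test(dataProvider = "...") annotation; here we inline a few hypothetical job rows and drive the check with a plain loop so the sketch runs without TestNG on the classpath.

```java
// Sketch of the DataProvider-style pattern: one data-producing method,
// one test method invoked once per data row. Job names are hypothetical.
public class BatchJobDataDriven {

    // Equivalent of a TestNG @DataProvider method: one Object[] per invocation.
    static Object[][] jobs() {
        return new Object[][] {
            { "nightly-billing", true },  // job name, enabled flag
            { "weekly-cleanup", true },
            { "legacy-export", false },   // filtered out for this product
        };
    }

    // The "test method" run once per data row.
    static boolean shouldRun(String jobName, boolean enabled) {
        return enabled && !jobName.startsWith("legacy-");
    }

    public static void main(String[] args) {
        for (Object[] row : jobs()) {
            String name = (String) row[0];
            boolean enabled = (boolean) row[1];
            System.out.println(name + " -> " + shouldRun(name, enabled));
        }
    }
}
```

With TestNG, the loop disappears: the framework invokes the annotated test method once per row and reports each invocation as a separate result.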

The following diagram shows the details for the automation of database batch jobs.

details of the database batch jobs automation

The next stage in our test automation process was report comparison. We deployed the Reports Comparison tool developed by DB Best to compare the reports generated on Oracle Database and PostgreSQL.
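At its core, report comparison means running the same report against both databases and diffing the results cell by cell. The sketch below illustrates that idea with maps standing in for rows fetched over JDBC; the actual DB Best Reports Comparison tool is far more involved (type coercion, row ordering, tolerance rules).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Illustrative cell-by-cell diff of one report row from each database.
public class ReportDiff {

    public static List<String> diff(Map<String, String> oracleRow,
                                    Map<String, String> postgresRow) {
        List<String> mismatches = new ArrayList<>();
        for (Map.Entry<String, String> cell : oracleRow.entrySet()) {
            String other = postgresRow.get(cell.getKey());
            if (!cell.getValue().equals(other)) {
                mismatches.add(cell.getKey() + ": oracle=" + cell.getValue()
                        + " postgres=" + other);
            }
        }
        return mismatches;
    }

    public static void main(String[] args) {
        Map<String, String> ora = Map.of("total", "120", "rows", "42");
        Map<String, String> pg  = Map.of("total", "120", "rows", "41");
        System.out.println(diff(ora, pg)); // reports the "rows" mismatch
    }
}
```

An empty mismatch list for every report is the signal that the PostgreSQL deployment reproduces the Oracle behavior.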

Reports comparison dashboard

Finally, we executed an overall database state comparison using the DB Best Migration Platform. This step verifies that all the applications behave exactly the way they did before the migration. Once the reports are ready, the customer can track the migration status through the DB Best Report Portal.

Quality is the result of carefully constructed testing

This project required the DB Best team to work on several tracks in parallel, including unification between Oracle Database and Amazon Aurora PostgreSQL, QA, and the development of new features. In addition to the database unification, our client gained a test automation framework integrated with their new CI/CD pipeline and a well-tuned path for future code merges.

Are you still suffering from an outdated testing environment that pulls your business down?

The DB Best team is ready to build a test automation pipeline for your system. Contact us for a consultation on the best way to automate your daily testing routine and boost the frequency of new releases!
