Blog: Using Azure DevOps and PowerShell to automate testing of SSIS packages

One of our customers aimed to modernize their on-premises project and replace their aging AS/400 system with Microsoft SQL Server. With an enormous amount of code, the existing program was barely maintainable and called for a smart solution.

DB Best developed a brand-new way to manage epic projects. We needed a solution that would optimize costs and streamline the process, so our team leveraged the latest DevOps practices to build one that manages these large systems in layers.

In this post, we share our best practices for reducing human error and delivery costs with a SQL Server Integration Services (SSIS) based controlling framework.

Challenges of managing big projects

Despite the existence of version control systems like Git, SVN, or CVS, managing large projects remains a challenge. Here, we describe a case where version control alone couldn't be our silver bullet.

The background situation was tricky yet common for almost every team: 15 developers on a project with an enormous number of lines of code, where every check-in introduced significant changes or affected system performance.

Our team realized we needed a more advanced solution, so we elaborated a new approach to address the challenge. Since we had to deliver on a limited time and budget, our architects decided to build a controlling framework to optimize the development process.

Building the SSIS controlling framework

Our team decided to build a framework that orders the program packages and runs them in defined sequences. This approach allows our developers to unit test their changes using the Azure DevOps build process. A developer starts by creating a branch on the test server and making changes to the SSIS package code. Once they check in their code, they use the Queue build task to start the Azure Pipeline test run.
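As a rough sketch, the build-and-test flow above could be described in an Azure Pipelines YAML definition along these lines. The pool name, script paths, and file names here are illustrative assumptions, not taken from the actual project:

```yaml
# Illustrative azure-pipelines.yml for the branch -> check-in -> test-run flow.
trigger:
  branches:
    include:
      - feature/*            # developer test branches

pool:
  name: 'SsisTestAgents'     # hypothetical self-hosted pool with SQL Server installed

steps:
  - task: VSBuild@1
    displayName: 'Build SSIS project (.ispac)'
    inputs:
      solution: '**/*.sln'
      configuration: 'Development'

  - task: PowerShell@2
    displayName: 'Deploy changed SSIS package'
    inputs:
      filePath: 'scripts/Deploy-SsisPackage.ps1'   # hypothetical script name

  - task: PowerShell@2
    displayName: 'Run data-layer test via controlling framework'
    inputs:
      filePath: 'scripts/Start-LayerTest.ps1'      # hypothetical script name
```

A queued build on the test branch runs these steps in order, so the test run starts automatically after a successful deployment.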

The solution divides the assets into several data layers, each with its own master package. Each master package calls its child packages and specifies the proper run order.

The diagram illustrates the process of running a separate data layer, which uses PowerShell to start the build process.

DevOps practices

  • Generating the Batch ID. Creates a record in the ETLBatch table that tracks the test run.
  • Running tables. Updates the tables that define the test runs, using the Batch ID as a primary key to audit the run.
  • Receiving load sequences. This controller task takes the SSIS package test cases and creates up to 8 threads that process data simultaneously. The system also allows switching sequences for different test scenarios.
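The Batch ID generation step can be sketched in PowerShell as follows. The post only names the ETLBatch table and the Batch ID, so the database name, column names, and statuses below are assumptions for illustration:

```powershell
# Illustrative sketch: create the audit record in ETLBatch and capture the
# Batch ID for this test run. Database and column names are assumptions.
$conn = New-Object System.Data.SqlClient.SqlConnection `
    "Data Source=localhost;Initial Catalog=ETLControl;Integrated Security=SSPI;"
$conn.Open()

$cmd = $conn.CreateCommand()
$cmd.CommandText = @"
INSERT INTO dbo.ETLBatch (StartTime, Status)
VALUES (SYSUTCDATETIME(), 'Running');
SELECT CAST(SCOPE_IDENTITY() AS int);
"@

# ExecuteScalar returns the new identity value, which becomes the Batch ID
$batchId = $cmd.ExecuteScalar()
$conn.Close()

Write-Host "Started test run with Batch ID $batchId"
```

The controlling framework can then stamp this Batch ID on every audit record the run produces, which is what makes the later Success/Failure reporting possible.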

Multi-threaded execution

The following diagram shows how multi-threaded execution works for running test cases.

DevOps practices

  • Success / Failure. The execution framework looks at the generated audit records to report the run results.
  • Updating the Batch ID. The execution framework updates the Batch ID record and provides the developer with specific execution results in the event of a failure.
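One simple way to approximate the "up to 8 concurrent workers" behavior in PowerShell is with background jobs and a throttle loop. This is a sketch only: the package names and the `Invoke-PackageTest` helper are hypothetical, and `Start-Job` uses background processes rather than true threads:

```powershell
# Illustrative sketch: run package test cases with at most 8 concurrent jobs.
$maxWorkers = 8
$testCases  = 'LoadCustomers.dtsx', 'LoadOrders.dtsx', 'LoadMetrics.dtsx'  # hypothetical

foreach ($package in $testCases) {
    # Throttle: wait for a free slot before starting the next job
    while ((Get-Job -State Running).Count -ge $maxWorkers) {
        Start-Sleep -Seconds 1
    }
    Start-Job -Name $package -ScriptBlock {
        param($pkg)
        # Hypothetical helper that executes one package and writes its audit records
        Invoke-PackageTest -PackageName $pkg
    } -ArgumentList $package
}

# Wait for all jobs to finish and collect their output for the Success/Failure report
Get-Job | Wait-Job | Receive-Job
```

On newer PowerShell versions, `ForEach-Object -Parallel -ThrottleLimit 8` achieves the same effect more compactly.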

We utilized PowerShell to deploy a single package, instead of redeploying the entire SSIS project, when a package fails or requires fixes. Microsoft provides an example of how to deploy multiple SSIS packages into a catalog; however, our parallel execution of SSIS packages required a PowerShell script that deploys a single package. The following script shows how to deploy a single package to the SSIS catalog.

# Define the package location parameters. Assumes the current directory is set to the source project in the test environment.
$ProjectPath = (Get-Location).Path
$ProjectName = "Metrics-Project"
$FolderName = "Load_Metrics"
$EnvironmentName = "Development"
$IspacFilePath = $ProjectPath + "\" + $ProjectName + "\bin\" + $EnvironmentName + "\" + $ProjectName + ".ispac"

# Load the IntegrationServices assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null

# Store the IntegrationServices assembly namespace to avoid typing it every time
$ISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"

# Provide the password for the SSIS catalog. This is separate from the server login, which uses Integrated Security.
$ServerName = "localhost"   # use "localhost\InstanceName" for a named instance
# Use a secure string for the password. Normally, the password would come from a secure store such as Azure Key Vault.
$CatalogPwd = ConvertTo-SecureString -String 'P@$$w0rd' -AsPlainText -Force   # sample value only
$SSISCatalog = "SSISDB"

Write-Host "Connecting to server ..."

# Create a connection to the server
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString

$integrationServices = New-Object "$ISNamespace.IntegrationServices" $sqlConnection

$catalog = $integrationServices.Catalogs[$SSISCatalog]

# Create the Integration Services object if it does not exist
if (!$catalog) {
    # Provision a new SSIS Catalog. The Catalog constructor expects a plain-text password string.
    Write-Host "Creating SSIS Catalog ..."
    $plainPwd = [System.Net.NetworkCredential]::new('', $CatalogPwd).Password
    $catalog = New-Object "$ISNamespace.Catalog" ($integrationServices, $SSISCatalog, $plainPwd)
    $catalog.Create()
}
$folder = $catalog.Folders[$FolderName]

if (!$folder) {
    # Create a folder in SSISDB
    Write-Host "Creating Folder ..."
    $folder = New-Object "$ISNamespace.CatalogFolder" ($catalog, $FolderName, $FolderName)
    $folder.Create()
}

# Read the project file, and deploy it to the folder
Write-Host "Deploying Project ..."
[byte[]] $IspacFile = [System.IO.File]::ReadAllBytes($IspacFilePath)
$folder.DeployProject($ProjectName, $IspacFile)

$environment = $folder.Environments[$EnvironmentName]

if (!$environment) {
    # Create an environment in the folder
    Write-Host "Creating environment ..."
    $environment = New-Object "$ISNamespace.EnvironmentInfo" ($folder, $EnvironmentName, $EnvironmentName)
    $environment.Create()
}

$project = $folder.Projects[$ProjectName]
$ref = $project.References[$EnvironmentName, $folder.Name]

if (!$ref) {
    # Make the project refer to this environment
    Write-Host "Adding environment reference to project ..."
    $project.References.Add($EnvironmentName, $folder.Name)
    $project.Alter()
}
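Continuing from the objects the deployment script creates, a single package from the freshly deployed project can be started through the same object model. The package name below is a hypothetical example:

```powershell
# Illustrative follow-up to the deployment script above.
# "LoadMetrics.dtsx" is a hypothetical package name.
$package = $project.Packages["LoadMetrics.dtsx"]
$reference = $project.References[$EnvironmentName, $folder.Name]

# Execute(use32RuntimeOn64, environmentReference) starts the package
# asynchronously and returns the operation ID for tracking the run
$executionId = $package.Execute($false, $reference)
Write-Host "Started execution with operation ID $executionId"
```

The returned operation ID can be matched against the SSISDB catalog views (such as catalog.executions) to collect the run results for the audit records.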

Benefits in facts and numbers

Implementing DevOps practices gave us the upper hand in completing this project. According to our survey, the build process allowed the team to reduce the number of apparent, mechanical errors by 99%.

The SSIS controlling framework proved effective at preventing errors such as:

  • References to deleted and renamed objects
  • Syntax errors
  • Missing objects

Analyzing performance on this particular project, we could clearly see the advantages of implementing the SSIS controlling framework:

  • Increased the quality of delivery and provided for error-free code output
  • Significantly decreased the time spent finding errors in code
  • Reduced the cases of losing code changes during development
  • Optimized the communication inside the development team

SSIS controlling framework benefits

Take advantage of our best practices

Throughout DB Best's history, we have managed huge customer projects and data assets, shaping our team's approach to assembling tasks and code files.

Having a toolkit of proven solutions is as valuable as having experienced developers on the team. We at DB Best have both. What's more, we gladly share our developments. We offer assistance in applying DevOps practices to automate your development process and deliver an error-free product that is easy to manage.

Contact us to learn how your particular product can benefit from implementing DevOps techniques. Always happy to assist. Your DB Best.
