Automating Dacpac Comparison in a CI/CD Pipeline: Deploy Only If There's a Change! 🤖🛠

This is a continuation of the previous article, dacpac-magic-from-source-to-target-in-a-click-wrapped-in-a-pipeline.

So you're looking to automate Dacpac comparison as part of your CI/CD pipeline, and you only want to trigger a deployment when there are changes. Here's a guide to do just that!

Technologies We'll Use 🛠

  • Azure DevOps (for CI/CD)
  • PowerShell (for scripting)
  • sqlpackage (for Dacpac operations)

Step 1: Store the Previous Dacpac for Comparison 🗄

  • Store the last successfully deployed Dacpac somewhere accessible to the pipeline, like Azure Blob Storage or as an artifact in Azure DevOps.
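If you keep the previous Dacpac as an Azure DevOps pipeline artifact, the publish and download can be sketched as follows. `PublishPipelineArtifact@1` and `DownloadPipelineArtifact@2` are real Azure DevOps tasks; the artifact name and paths are illustrative, and this assumes the artifact was published by an earlier run of the same pipeline on main:

```yaml
# At the end of a successful deployment: publish the Dacpac for future comparisons
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Pipeline.Workspace)/new.dacpac'
    artifact: 'last-deployed-dacpac'

# In a later run: pull the artifact from the latest run on main for comparison
- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    project: '$(System.TeamProject)'
    pipeline: '$(System.DefinitionId)'
    runVersion: 'latestFromBranch'
    runBranch: 'refs/heads/main'
    artifact: 'last-deployed-dacpac'
    path: '$(Pipeline.Workspace)'
```

The Blob Storage approach shown later in this article works just as well; artifacts simply keep everything inside Azure DevOps.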

Step 2: Create the Azure DevOps CI Pipeline 🏗

  1. Generate New Dacpac: As part of your build process, create a new Dacpac from your source database or codebase.
  2. Download Previous Dacpac: Add a step to download the previously deployed Dacpac.
  3. Compare Dacpacs: Create a custom script step that compares the new Dacpac with the old one. You can use PowerShell and sqlpackage for this.
- script: |
    sqlpackage /a:Script /sf:"$(Pipeline.Workspace)/new.dacpac" /tf:"$(Pipeline.Workspace)/old.dacpac" /tdn:YourDatabase /op:"$(Pipeline.Workspace)/comparisonResult.sql"
  displayName: 'Compare Dacpac Files'

Step 3: Custom Script to Check for Changes 🕵️‍♀️

Create a PowerShell script that reads comparisonResult.sql and checks if there are any changes.

# Read the comparison result as a single string
$comparisonResult = Get-Content -Path "$(Pipeline.Workspace)/comparisonResult.sql" -Raw

# Check if there are changes and surface the result as a pipeline variable.
# (Exiting with a non-zero code here would mark the step as failed, so we
# set a variable instead and let a later condition decide whether to deploy.)
if ([string]::IsNullOrWhiteSpace($comparisonResult)) {
    Write-Host "No changes detected. Skipping deployment."
    Write-Host "##vso[task.setvariable variable=ChangesDetected]false"
} else {
    Write-Host "Changes detected. Proceeding to deployment."
    Write-Host "##vso[task.setvariable variable=ChangesDetected]true"
}
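One caveat: the script SqlPackage emits is rarely completely empty even when there are no schema changes, because it usually contains boilerplate such as SET options and GO batch separators. A more robust check looks for actual DDL statements. A minimal sketch in shell, using a hypothetical comparisonResult.sql built inline for illustration:

```shell
# Create a hypothetical comparison result containing only boilerplate (no DDL)
printf 'SET ANSI_NULLS ON\nGO\nSET QUOTED_IDENTIFIER ON\nGO\n' > comparisonResult.sql

# Treat the result as "changed" only if it contains real DDL statements
if grep -qE '^[[:space:]]*(CREATE|ALTER|DROP)[[:space:]]' comparisonResult.sql; then
    echo "Changes detected."
else
    echo "No changes detected."
fi
```

The same keyword check translates directly into the PowerShell step above if you prefer to keep everything in one language.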

Step 4: Conditionally Deploy in Azure DevOps 🚀

Add a conditional deployment step that only runs if the PowerShell script indicates there are changes.

- deployment: DeployDacpac
  condition: eq(variables['ChangesDetected'], 'true')
  pool:
    vmImage: 'windows-latest'
  environment: 'Production'
  strategy:
    runOnce:
      deploy:
        steps:
        - task: SqlAzureDacpacDeployment@1
          inputs:
            azureSubscription: 'Your-Azure-Subscription'
            ServerName: 'yourserver.database.windows.net'
            DatabaseName: 'YourDatabase'
            DacpacFile: '$(Pipeline.Workspace)/new.dacpac'

In this example, the deployment job uses a custom variable, ChangesDetected, set from the PowerShell script's output. Because the deployment runs in a separate job, the variable must be exported with isOutput=true and referenced through the dependencies syntax.
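The cross-job wiring can be sketched as follows; job and step names here are illustrative, but the `##vso` logging command and the `dependencies` condition syntax are standard Azure Pipelines features:

```yaml
jobs:
- job: Compare
  steps:
  - powershell: |
      # Export an output variable that other jobs can read
      Write-Host "##vso[task.setvariable variable=ChangesDetected;isOutput=true]true"
    name: compareStep

- job: Deploy
  dependsOn: Compare
  condition: eq(dependencies.Compare.outputs['compareStep.ChangesDetected'], 'true')
  steps:
  - script: echo "Deploying..."
```

Note that the step must have a `name` (not just a `displayName`) for its outputs to be addressable.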

Step 5: Update the “Old” Dacpac 🔄

After a successful deployment, upload the new Dacpac as the “old” Dacpac for future comparisons.

- script: |
    # Upload new.dacpac to Azure Blob or Artifact for future comparison
  condition: eq(variables['ChangesDetected'], 'true')
  displayName: 'Update Stored Dacpac'

Below is a more detailed YAML example for Azure DevOps that includes all the necessary steps for automating the Dacpac comparison and conditional deployment.

trigger:
- main

variables:
  vmImageName: 'windows-latest'

stages:
- stage: BuildAndCompare
  jobs:
  - job: GenerateAndCompareDacpac
    pool:
      vmImage: $(vmImageName)
    steps:
    - script: |
        # Assume you have necessary steps to generate Dacpac from the staging environment
        # For example, using sqlpackage:
        sqlpackage /a:Extract /scs:"Server=stagingServer;Database=stagingDB;..." /tf:"$(Pipeline.Workspace)/staging.dacpac"
      displayName: 'Generate Dacpac from Staging'

    - script: |
        # Download the previous Dacpac from Blob Storage or Git
        # Azure CLI for Blob Storage example:
        az storage blob download --account-name myaccount --container-name mycontainer --name production.dacpac --file "$(Pipeline.Workspace)/production.dacpac"
      displayName: 'Download Previous Production Dacpac'

    - script: |
        # Compare Dacpacs
        sqlpackage /a:Script /sf:"$(Pipeline.Workspace)/staging.dacpac" /tf:"$(Pipeline.Workspace)/production.dacpac" /tdn:YourDatabase /op:"$(Pipeline.Workspace)/comparisonResult.sql"
      displayName: 'Compare Dacpac Files'

    - powershell: |
        # Check whether the comparison produced any changes
        $comparisonResult = Get-Content -Path "$(Pipeline.Workspace)/comparisonResult.sql" -Raw
        if ([string]::IsNullOrWhiteSpace($comparisonResult)) {
            Write-Host "##vso[task.setvariable variable=areThereChanges;isOutput=true]false"
        } else {
            Write-Host "##vso[task.setvariable variable=areThereChanges;isOutput=true]true"
        }
      displayName: 'Check for Changes'
      name: checkForChanges

- stage: ConditionalDeploy
  condition: eq(dependencies.BuildAndCompare.outputs['GenerateAndCompareDacpac.checkForChanges.areThereChanges'], 'true')
  jobs:
  - deployment: DeployToProd
    pool:
      vmImage: $(vmImageName)
    environment: 'Production'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: SqlAzureDacpacDeployment@1
            inputs:
              azureSubscription: 'Your-Azure-Subscription'
              ServerName: 'yourserver.database.windows.net'
              DatabaseName: 'YourDatabase'
              DacpacFile: '$(Pipeline.Workspace)/staging.dacpac'

          - script: |
              # Archive the Dacpac to Blob Storage for the next comparison
              az storage blob upload --overwrite --account-name myaccount --container-name mycontainer --name production.dacpac --file "$(Pipeline.Workspace)/staging.dacpac"
            displayName: 'Archive Dacpac'

In this YAML file:

  • The GenerateAndCompareDacpac job generates a Dacpac from the staging database, downloads the last deployed Dacpac from Azure Blob storage, and then compares them.
  • The PowerShell step sets an output variable areThereChanges based on the comparison result.
  • The ConditionalDeploy stage checks this variable and only runs if areThereChanges is true. This means that deployment to production will only occur if there are changes.
  • The new Dacpac is archived to Azure Blob Storage after a successful deployment. Note that files in $(Pipeline.Workspace) do not persist across stages, so in practice you would also publish staging.dacpac as a pipeline artifact in the first stage for the deployment job to download.

Remember to replace placeholders like yourserver.database.windows.net, YourDatabase, and Your-Azure-Subscription with your actual Azure details.

Alternate Tools for Dacpac Comparison and Deployment

While SSMS is commonly used for SQL Server operations, it can be a heavyweight solution for CI/CD pipelines and isn't particularly scalable. Below are some alternative tools and methods for working with Dacpacs.

1. Azure Data Studio

  • Why Use It: A lighter-weight, cross-platform tool for working with SQL Server and other databases. It can be used to generate and deploy Dacpac files.

2. Command-Line Utilities

  • Why Use It: SqlPackage, a command-line utility that comes with the SQL Server Data-Tier Application Framework (DacFx), can be used for Dacpac operations. This utility can be installed independently on a build agent, removing the need for SSMS.
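As one example, besides the Extract and Script actions used above, SqlPackage's DeployReport action produces an XML summary of the operations a deployment would perform, which can also serve as a change check. A sketch of a pipeline step (paths and the database name are illustrative, and this assumes your SqlPackage version supports a Dacpac as the target):

```yaml
- script: |
    sqlpackage /a:DeployReport /sf:"$(Pipeline.Workspace)/staging.dacpac" /tf:"$(Pipeline.Workspace)/production.dacpac" /tdn:YourDatabase /op:"$(Pipeline.Workspace)/deployReport.xml"
  displayName: 'Generate Deploy Report'
```

An empty `<Operations/>` element in the report indicates there is nothing to deploy.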

3. PowerShell Modules

  • Why Use It: PowerShell can drive Dacpac operations, for example through the DacFx .NET API or community modules such as PublishDacPac (whose Publish-DacPac cmdlet wraps SqlPackage), offering flexibility and integration with other PowerShell-based tooling.

4. DbUp

  • Why Use It: DbUp is a .NET library that helps you to deploy changes to SQL Server databases. While not Dacpac-based, it can be an alternative approach for schema deployments.

5. Redgate SQL Compare

  • Why Use It: This is a third-party tool that's robust and provides a GUI as well as a command-line interface. It can compare database schemas and generate scripts for synchronization.

6. Liquibase

  • Why Use It: This is a more database-agnostic tool and works on the concept of changesets. If your environment includes different types of databases, Liquibase can be a good fit.

7. Flyway

  • Why Use It: Similar to Liquibase but simpler to use, Flyway allows you to version your database schema. It uses versioned SQL migration scripts rather than Dacpacs.

8. Azure DevTest Labs

  • Why Use It: If you must use tools that depend on a GUI, you can set up automated VMs in Azure DevTest Labs to run these tools and then shut down the VMs to save costs.

9. Docker Containers

  • Why Use It: You can package your toolset into a Docker container and run it as part of the CI/CD pipeline, ensuring that your build agent remains lightweight.

Why Consider Alternatives:

  • Scalability: Using lightweight, command-line tools can be more scalable and fit better into a DevOps pipeline.
  • Flexibility: Using database-agnostic tools or .NET libraries can give you more flexibility in your CI/CD pipeline.
  • Resource Efficiency: Many of these tools are less resource-intensive than installing the full SSMS suite on a build agent.
  • Cross-Platform: Tools like Azure Data Studio, Liquibase, or Docker-based solutions can be run on different platforms, offering more flexibility in your build environment.

By choosing an appropriate tool, you can keep your CI/CD pipeline lean and focused, while still leveraging powerful features for database schema comparisons and deployments.

I hope this post is helpful. Please share!
