Verifying your Database Deployment with Azure DevOps

The Discussion

  • Houssem Dellai

    Thank you, Damian, for inviting me to talk on the DevOps Lab :)

  • Liero
    Ah, why do you always present SSDT/dacpac as a solution for DevOps when it isn't?

    SSDT generates change scripts by comparing the dacpac schema with the actual database; however, there is no guarantee the scripts will work.

    SSDT can't handle database changes like changing a column's datatype, renaming a column, or splitting one table into several without possible data loss. Since SSDT offers no way to run a custom migration script to handle those changes as part of the dacpac deployment, SSDT is not suitable for database DevOps.

    Can you please explain how I should deploy the changes described above using the SSDT automated deployment you described in the video?
  • Damovisa
    Hi @Liero,

    SSDT can definitely be used in those scenarios, but it's not a magic bullet - you need to tell it how to handle the migration in cases like this.

    SSDT has a data loss safeguard which, when enabled, will fail the deployment in cases where data loss would occur.

    It's generally good practice to run the migrations in earlier environments - you _never_ want the first time you run a migration to be in Production! In fact, that was the point of Houssem's last demo. He pulls down the production database into a pre-prod environment and runs the migration against it (then any tests to verify success) before going to production.

    Of course, you can disable the data loss safeguard and provide pre- and post-deployment scripts to define how the data should be migrated safely (a sketch of that pattern follows this discussion). There's a great article by Dan Nolan from Redgate about this: https://www.red-gate.com/simple-talk/sql/database-delivery/fix-ten-ssdt-deployment-snags-without-readyroll/

    I hope that helps!
    Damian
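
To make the pre- and post-deployment script pattern concrete, here is a minimal sketch for the "column moves to a new table" case Liero raised. Everything in it is hypothetical: it assumes a dbo.Customer table whose Address column is being moved into a new dbo.CustomerAddress table in the updated dacpac, and the staging table name is made up. In an SSDT project the two halves would live in the Pre-Deployment and Post-Deployment script files, and the publish would need the data loss safeguard (the BlockOnPossibleDataLoss publish option) turned off so the column drop is allowed to run.

-- ===== Pre-deployment script (sketch; hypothetical object names) =====
-- Runs before the generated schema changes. Stash the Address data that the
-- dacpac diff is about to drop from dbo.Customer. Dynamic SQL keeps the batch
-- valid on re-runs after the column has already been dropped.
IF COL_LENGTH(N'dbo.Customer', N'Address') IS NOT NULL
BEGIN
    IF OBJECT_ID(N'dbo.__CustomerAddress_Stash', N'U') IS NOT NULL
        DROP TABLE dbo.__CustomerAddress_Stash;

    EXEC sp_executesql
        N'SELECT CustomerId, Address
          INTO dbo.__CustomerAddress_Stash
          FROM dbo.Customer;';
END
GO

-- ===== Post-deployment script (sketch) =====
-- Runs after the schema changes, so dbo.CustomerAddress now exists.
-- Move the stashed rows into the new table, then clean up the stash.
IF OBJECT_ID(N'dbo.__CustomerAddress_Stash', N'U') IS NOT NULL
BEGIN
    INSERT INTO dbo.CustomerAddress (CustomerId, Address)
    SELECT s.CustomerId, s.Address
    FROM dbo.__CustomerAddress_Stash AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.CustomerAddress AS a
                      WHERE a.CustomerId = s.CustomerId);

    DROP TABLE dbo.__CustomerAddress_Stash;
END
GO

For a plain column rename, the Rename refactoring inside Visual Studio (recorded in the project's refactorlog) is usually the better choice, since the generated deployment then calls sp_rename instead of dropping and re-adding the column.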
