Verifying your Database Deployment with Azure DevOps

Description

While at Microsoft Ignite | The Tour in Berlin, Damian speaks to Microsoft MVP Houssem Dellai about some options for deploying your database alongside your application.

Houssem shows a few different ways to deploy database changes, including a clever pre-production verification process for ensuring your production deployment will succeed. Database upgrades are often the scariest part of your deployment process, so having a robust check before getting to production is very important.

Follow Houssem on Twitter: @HoussemDellai
Follow Damian on Twitter: @damovisa

Start with Azure DevOps for Free: http://dev.azure.com
Create a Free Azure Account: https://aka.ms/c9-azurefree

Tags:

Database, devops

The Discussion

  • HoussemDellai

    Thank you, Damian, for having me on the DevOps Lab :)

  • Liero
    Ah, why do you always present SSDT/dacpac as a solution for DevOps when it isn't?

    SSDT generates change scripts by comparing the dacpac schema with the actual database; however, there is no guarantee the scripts will work.

    SSDT can't handle database changes like changing a column's datatype, renaming a column, or splitting one table into several without possible data loss. Since SSDT offers no way to run a custom migration script to handle those changes as part of a dacpac deployment, SSDT is not suitable for database DevOps.

    Can you please explain how I should deploy the changes described above using the SSDT automated deployment you described in the video?
  • Damovisa
    Hi @Liero,

    SSDT can definitely be used in those scenarios, but it's not a magic bullet - you need to tell it how to handle the migration in cases like this.

    SSDT has a data loss safeguard which, when turned on, will fail the deployment in cases where data loss would occur.

    It's generally good practice to run the migrations in earlier environments - you _never_ want the first time you run a migration to be in Production! In fact, that was the point of Houssem's last demo. He pulls the production database down into a pre-prod environment and runs the migration against it (then runs any tests to verify success) before going to production.

    Of course, you can disable the data loss safeguard and provide pre- and post-deployment scripts to define how data should be migrated safely. There's a great article by Dan Nolan from Redgate about this: https://www.red-gate.com/simple-talk/sql/database-delivery/fix-ten-ssdt-deployment-snags-without-readyroll/

    I hope that helps!
    Damian
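
    As a rough sketch of the pre-prod verification flow Damian describes, an Azure Pipelines YAML stage might look like the following. This is illustrative only - the server names, variable names, and the `CopyProdDb.sql` restore script are placeholders, not the pipeline from the video:

    ```yaml
    # Pre-prod verification stage: restore a copy of production,
    # then rehearse the dacpac deployment against it.
    stages:
    - stage: PreProd
      jobs:
      - job: VerifyMigration
        steps:
        # Restore the latest production backup into the pre-prod server
        # (CopyProdDb.sql is a hypothetical restore script).
        - task: SqlAzureDacpacDeployment@1
          inputs:
            azureSubscription: $(serviceConnection)
            ServerName: $(preProdServer)
            DatabaseName: $(preProdDb)
            deployType: SqlTask
            SqlFile: scripts/CopyProdDb.sql
        # Run the same dacpac deployment that production will get.
        # If the migration (or the data loss safeguard) fails here,
        # nothing ever reaches production.
        - task: SqlAzureDacpacDeployment@1
          inputs:
            azureSubscription: $(serviceConnection)
            ServerName: $(preProdServer)
            DatabaseName: $(preProdDb)
            deployType: DacpacTask
            DacpacFile: $(Pipeline.Workspace)/drop/MyDb.dacpac
    ```

    Integration or functional tests would then run as further steps in the same stage before the production stage is allowed to start.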
  • langman66
    Great video! Really helped us with a project where we're using Dacpac and Azure DevOps CI/CD. Would it be possible to share out some of your tasks/pipelines to a git repo?

    Thanks for putting this together.

    Chris
  • HoussemDellai
    Hi @langman66

    Here's the YAML CI pipeline with the project source code on GitHub: https://github.com/HoussemDellai/WebAppWithDatabaseDemo/blob/master/vsts-ci.yml

    And this blog article gives more details: https://medium.com/devopslinks/devops-for-database-4e442abfd939

    I hope you'll find the details you are looking for :)
  • slusk
    @Damovisa (Damian) and @HoussemDellai (Houssem), excellent content, thanks so much for this video and all you continue to add to the community. Found this very useful, really like the pre-prod approach.

    Curious - in your implementations, do you use the Pre-Prod stage for other resources besides the database, and do you run any integration or functional tests against the database in Pre-Prod? I know SSDT will catch a number of things during the dacpac deploy, but not necessarily everything.

    Thanks again!
  • HoussemDellai
    Hi @slusk !
    Thank you for your feedback :)
    Actually, it is possible to run integration/functional tests in this environment.
    The more tests you have, the more reliable the deployment will be.
    Here is the full PreProd env, described in this article: https://medium.com/devopslinks/preprod-for-database-288aaec97a1
  • Liero
    @Damovisa:
    "SSDT can definitely be used in those scenarios, but it's not a magic bullet - you need to tell it how to handle the migration in cases like this."

    Well, that's the point. How do I tell it how to handle the migration when it generates the scripts automatically? I've found no way to plug in custom code. Pre- and post-deployment scripts are just not enough.

  • Damovisa
    @Liero

    The pre- and post-deployment scripts are really the only way you can manage the process (or "tell it how to handle the migration").

    Because it's a bit of a black box, it's going to do whatever it thinks is correct for updating the schema, but you can control things with these pre and post stages - for example, temporarily storing data in a holding table, then copying it back and removing it at the end.

    Long story short, if you're not comfortable letting the tool make schema changes for you, then the model-based option is probably not right for you.
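
    As a concrete sketch of the "store, migrate, restore" pattern Damian mentions - say a `Name` column is being split into `FirstName`/`LastName` - the pre- and post-deployment scripts (normally two separate files in the SSDT project) might look like this. All table and column names here are hypothetical:

    ```sql
    -- Pre-deployment script: stash the data SSDT would otherwise drop.
    IF OBJECT_ID('dbo.Person_NameBackup') IS NULL
    BEGIN
        SELECT Id, Name
        INTO dbo.Person_NameBackup
        FROM dbo.Person;
    END

    -- Post-deployment script: after SSDT has applied the new schema,
    -- populate the new columns from the stashed copy, then clean up.
    UPDATE p
    SET p.FirstName = LEFT(b.Name, CHARINDEX(' ', b.Name + ' ') - 1),
        p.LastName  = LTRIM(SUBSTRING(b.Name, CHARINDEX(' ', b.Name + ' '), 4000))
    FROM dbo.Person AS p
    JOIN dbo.Person_NameBackup AS b ON b.Id = p.Id;

    DROP TABLE dbo.Person_NameBackup;
    ```

    With this approach the data loss safeguard can stay off for just this change, while the pre-prod rehearsal described earlier verifies the whole sequence against a copy of production first.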
