Best way to introduce SCA deployments to ops DBAs?

I have a license for the full SQL Toolbelt (heaven), but none of our ops DBAs currently have licenses. We also do not have any CI/CD tooling in place. Our current process has devs delivering a set of "Apply" and "Rollback" SQL scripts in source control to "hand off" to them for the UAT, preprod, Load Test, and prod environments. I'd like to discuss the best options for easing them into receiving deployment artifacts based on SCA projects.

Worst case is I send them a set of migration scripts from the SCA project.

Next up the ladder is using one DB (probably prod) to generate release artifacts, and handing off the TargetedDeploymentScript.sql, massaged so that it can run on multiple servers and against different DBs. They can then "approve" the script and execute it in SSMS in SQLCMD mode.

I'm not sure they would work well with a different set of release scripts for each target DB, but we could explore that.
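If we did go the per-target route, my understanding is it would look roughly like the sketch below: build the project once, then generate a release artifact per target so each script is tailored to that database's current state. This is a sketch only — the server names, paths, package id, and version are placeholders, not our real values.

```powershell
# Sketch: one release artifact per target database, each generated against
# that environment's current state. All names/paths are hypothetical.
Import-Module SqlChangeAutomation

$validated = Invoke-DatabaseBuild "C:\repos\MyDb\MyDb.sqlproj"
$build     = New-DatabaseBuildArtifact $validated -PackageId "MyDb" -PackageVersion "1.2.0"

$targets = @(
    @{ Server = "uat-sql01";  Database = "MyDb" },
    @{ Server = "load-sql01"; Database = "MyDb" },
    @{ Server = "prod-sql01"; Database = "MyDb" }
)

foreach ($t in $targets) {
    $conn    = New-DatabaseConnection -ServerInstance $t.Server -Database $t.Database
    $release = New-DatabaseReleaseArtifact -Source $build -Target $conn
    # Export a reviewable folder per environment for the ops DBAs.
    Export-DatabaseReleaseArtifact $release -Path "C:\releases\MyDb\$($t.Server)" -Format Folder
}
```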

The previous possibilities would at least track the migrations in __MigrationLog. However, we would not get the benefits of snapshots, which would give us drift and change reports — a huge perk.

Considering my licensing, what is the best way to hand them artifacts that will give us the benefits of snapshots, changes.html, drift.html, etc.? I imagine it's PowerShell driven, and I have my own deployment PS scripts based on the SCA cmdlets, but I'm not sure how that works with our licensing when handing off to the ops DBAs. What are my best options here?
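For context, my current generate-and-export flow is along these lines — a minimal sketch assuming the SqlChangeAutomation module is installed, with placeholder server and path names:

```powershell
# Sketch: generate a release artifact against one target, then export the
# reviewable files (deployment script, Changes.html, Drift.html) for hand-off.
# Server, database, path, and package names are placeholders.
Import-Module SqlChangeAutomation

$project = "C:\repos\MyDb\MyDb.sqlproj"          # SCA project in source control
$target  = New-DatabaseConnection -ServerInstance "prod-sql01" -Database "MyDb"

$validated = Invoke-DatabaseBuild $project
$build     = New-DatabaseBuildArtifact $validated -PackageId "MyDb" -PackageVersion "1.2.0"
$release   = New-DatabaseReleaseArtifact -Source $build -Target $target

# The exported folder holds the script plus the change and drift reports
# the DBAs can review before running anything.
Export-DatabaseReleaseArtifact $release -Path "C:\releases\MyDb\1.2.0" -Format Folder
```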


Best Answers

  • way0utwest Posts: 310 Rose Gold 1
    For me, I'd start with a PoC. Changing your existing process and altering what works is hard. It's scary, and you might break something.

    What I'd do is create a new db in dev and a new one in prod (and QA/test/staging/etc.). Put a single table in each, so they're synced. Now, add a proc in dev and check it into VCS. Set up CI/CD, and give the DBAs control over the release. Let them see what moves as you practice some small changes. Once you work out the best process for your group, with flow, timing, notifications, permissions, etc., you can talk about moving a prod db over.
  • PeterDaniels Posts: 89 Bronze 3
    Follow up:
    Initial prod deployment was a hack - SSMS-generated DDL and DML scripts. But I did include the __MigrationLog table and data.

    Once I cleared up my SCA deployment issues, I have been able to pass the New-DatabaseReleaseArtifact and Export-DatabaseReleaseArtifact generated TargetedReleaseScript.sql, with the server name check removed, on to the ops DBAs. They have been willing to execute that manually in SSMS in SQLCMD mode. This is working as an intermediate process until I can socialize deployment via CI/CD tooling.


  • PeterDaniels Posts: 89 Bronze 3
    edited January 31, 2019 6:26PM
    Thanks, @way0utwest. I like the way you're thinking and agree with the concept. However, I have a current, real project with a new DB that I've chosen, wisely or otherwise, to manage using SCA. So, now it's time to promote to the higher environments (above QA), and I'm scrambling to prepare for the handoff to the ops DBAs. I guess worst case is I just generate DDL and DML and hand that off, losing the benefits of tracking changes with __MigrationLog, snapshots/drift/change reports, etc.
  • You can pull out the script to create those objects from your baseline script. Once those are in prod, you can give the Update.sql script from the create-release step to the DBAs to review and run manually. This will update the migration log as part of the script until you implement more automation.
  • PeterDaniels Posts: 89 Bronze 3
    edited February 3, 2019 3:45PM
    Thank you, @way0utwest. To clarify, are you suggesting taking the "1.0.0-Baseline" script and deploying that initially, then using the PS cmdlets (New-DatabaseReleaseArtifact) to generate the TargetedDeploymentScript.sql moving forward?

    Edit: I've been using a combination of unchecking "mark first folder as baseline" + adding <SkipBaselineCheck>True</SkipBaselineCheck> to get New-DatabaseReleaseArtifact to include the whole set of migrations, so it deploys correctly to a newly created DB. Otherwise, it errors. The original "target" db used during project creation to generate the baseline script had a number of initial schema objects.
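    For anyone else trying this, the property mentioned above lives in the project file. A fragment like this is what I mean — the surrounding properties in a real .sqlproj will differ, so treat this as a sketch of where the element goes, not a complete project file:

```xml
<!-- .sqlproj fragment: tell SCA to run every migration, including the
     baseline script, against a freshly created empty database. -->
<PropertyGroup>
  <SkipBaselineCheck>True</SkipBaselineCheck>
</PropertyGroup>
```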
  • @way0utwest - would you be willing to respond to my last question por favor?
  • I tend to leave the first script (001) as the baseline and let the system detect this. It's supposed to detect whether the target (prod) has the objects. However, this is where you try to get prod synced with a baseline first and then start adding objects to dev that are added to the project separately.

    I'm not 100% sure how <SkipBaselineCheck> affects this, so let me ask someone else.
  • Thanks, @way0utwest. I've discovered that my SQL 2014 model database had my domain account as a db user, so SCA was seeing any "empty" target dbs as not empty, and therefore marking the baseline script as deployed. This made the project error on deployment. Not sure how the user got into the model db - perhaps during instance install, because I definitely didn't put it there. Thus, I used the techniques described above to ensure that the baseline script gets deployed.

    The other option was to remove that user, after which SCA saw the target as empty. Since I don't know exactly how SCA determines "empty", I'm keeping the skip property in place for now.

    In other news, I ended up just manually scripting the DDL and DML to get the project delivered to the ops DBAs, including the rows in the __MigrationLog table. Hoping I can return to SCA-based deployment artifacts again.
  • Follow up: since I included the __MigrationLog table and its data as part of the initial, non-SCA-generated deployment to our "higher" (ops DBA controlled) environments, I was able to introduce the new idea of using a SQLCMD script to the ops DBAs.

    The second deployment was the addition of a single new migration in a New-DatabaseReleaseArtifact generated TargetedDeploymentScript.sql. I manually modified it to comment out the target server check, so they would have one script (currently a requirement) for all of their servers. They successfully ran that on the UAT and Load Test servers, so we're at least gaining the benefit of SCA-generated deployment scripts and tracking in the __MigrationLog table.

    Concurrently, I've had some meetings to introduce the idea of using PowerShell to deploy using the Use-DatabaseReleaseArtifact cmdlet. That would give us the added benefit of snapshots, drift checks, etc. as an intermediate step.
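    The cmdlet-driven deployment I pitched would look roughly like this — again a sketch with placeholder server and path names, assuming the release artifact was previously exported to a folder:

```powershell
# Sketch: deploy a previously exported release artifact via the SCA cmdlets,
# which re-check the target's schema before applying the script.
# Server, database, and path names are placeholders.
Import-Module SqlChangeAutomation

$target  = New-DatabaseConnection -ServerInstance "uat-sql01" -Database "MyDb"
$release = Import-DatabaseReleaseArtifact "C:\releases\MyDb\1.2.0"

# Aborts if the target no longer matches the state the release was built
# against (i.e. drift), rather than applying the script blindly.
Use-DatabaseReleaseArtifact $release -DeployTo $target
```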

    I've also shown them a taste of my PoC using SCA, Bamboo, and Octopus Deploy, but my sense is we're months (or longer) away from being able to get buy-in for that move.

    Thank you for your help, Steve. Honored to have a celebrity work with me in the forums.


  • Glad that you're making progress. Baby steps are the best way, and thank you for the very kind words.
