Deploying data updates with SQL Change Automation

I was wondering how best to configure and use data update scripts with SQL Change Automation.

The scenario is that I need to add columns or tables as part of a database schema change but, at the same time, write a data patch script which will populate these new columns and tables with data. 

At the moment I write these in SSMS and run them manually, but it would be good if I could incorporate them into the build and release process.

Obviously, these data scripts should only be run for a particular version update and not every time that something changes.

Are there articles/tutorials/examples about this topic so that I can configure my system correctly?

Many thanks for any advice that you can provide.

Alan

Answers

  • SamTouray Posts: 13 Bronze 2
    There is a "Migrations" tab in the SSMS Redgate SQL Source Control plugin that allows you to combine schema changes with data migrations, however it has the following warning: "Migration scripts are for occasional use only and can lead to performance issues...". I've not used them myself, but I guess schema changes with data migrations don't happen that often?
  • Hi @brookal,

    Just tagging onto @SamTouray's note: I'm also assuming that you are authoring changes in SQL Source Control. If you're using SQL Change Automation for everything (authoring in Visual Studio with what used to be called ReadyRoll), just holler -- slightly different answer for that, but similar options.

    The method of doing this depends a little on which type of tables they are. If these are relatively small reference tables where the data only ever changes as part of the development process (not user input), such as a list of states or countries, then you might want to add these as static data tables, which essentially checks the data into source control. Documents on that are here. If you've got static data but it needs to be different in different environments, you can do that with a pre- or post-deployment script, as in this example.
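    To make the environment-specific idea concrete, here's a minimal sketch of what such a post-deployment script might look like. The `$(Environment)` SQLCMD variable, the `dbo.AppSetting` table, and the column names are all hypothetical placeholders, not anything SQL Change Automation defines for you -- your pipeline would need to supply the variable:

    ```sql
    -- Hypothetical post-deployment script: seed a reference table with
    -- environment-specific values. $(Environment) is an assumed SQLCMD
    -- variable supplied by the release pipeline; table/column names are
    -- placeholders for illustration.
    IF '$(Environment)' = 'Production'
    BEGIN
        MERGE dbo.AppSetting AS target
        USING (VALUES ('ApiBaseUrl', 'https://api.example.com'))
            AS source (SettingName, SettingValue)
        ON target.SettingName = source.SettingName
        WHEN MATCHED THEN
            UPDATE SET SettingValue = source.SettingValue
        WHEN NOT MATCHED THEN
            INSERT (SettingName, SettingValue)
            VALUES (source.SettingName, source.SettingValue);
    END
    ```

    Using MERGE (or an equivalent upsert) keeps the script safe to run on every deployment, which matters because post-deployment scripts typically execute on each release.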

    For data modifications to tables that aren't suitable for static data, I would also do this with a pre- or post-deployment script. The requirements for these are here, and the first one is really important: the script needs to be easily rerunnable, so you want it to record that it has been done and to start with a check that looks for that marker and skips the work if it has already run.
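    One common way to get that rerunnable behavior is a small bookkeeping table that records which patches have already been applied. This is only a sketch under assumed names -- `dbo.DataPatchLog` and the patched tables/columns are hypothetical, not part of any Redgate tooling:

    ```sql
    -- Hypothetical rerunnable data patch. dbo.DataPatchLog is an assumed
    -- bookkeeping table; the patch name acts as a run-once marker.
    IF NOT EXISTS (SELECT 1 FROM dbo.DataPatchLog
                   WHERE PatchName = '2019-06-populate-Region')
    BEGIN
        -- Backfill the new column from existing data.
        UPDATE c
        SET c.Region = r.RegionName
        FROM dbo.Customer AS c
        JOIN dbo.RegionLookup AS r
            ON r.CountryCode = c.CountryCode
        WHERE c.Region IS NULL;

        -- Record that this patch has run so later deployments skip it.
        INSERT INTO dbo.DataPatchLog (PatchName, AppliedAt)
        VALUES ('2019-06-populate-Region', SYSUTCDATETIME());
    END
    ```

    The initial `IF NOT EXISTS` check is what makes the whole script a no-op on subsequent deployments.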

    As @SamTouray mentioned, there are migration scripts in SQL Source Control, but I believe that in the future the pre- and post-deployment scripts will be what we rely on instead of the migration scripts, which may eventually be phased out as a feature.

    Hope this helps!
    Kendra
  • brookal Posts: 14 New member
    Thanks for the information.

    I can confirm that I am currently using SQL Change Automation so that my scripts can be deployed as part of the build and release pipelines in VSTS.

    The post-deployment folder seems like a good place, but obviously I need to ensure that data update scripts are only run once.
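    Besides a log table, another guard pattern is to make the check depend on the state of the data itself, so the script is a no-op once the patch has been applied. The table and column names below are assumptions for illustration only:

    ```sql
    -- Alternative run-once guard: inspect the data rather than a log table.
    -- dbo.Orders, LegacyStatusCode, and StatusId are hypothetical names.
    IF EXISTS (SELECT 1 FROM dbo.Orders
               WHERE LegacyStatusCode IS NOT NULL
                 AND StatusId IS NULL)
    BEGIN
        -- Translate the legacy code into the new column; rows already
        -- populated are untouched, so re-running changes nothing.
        UPDATE dbo.Orders
        SET StatusId = CASE LegacyStatusCode
                           WHEN 'O' THEN 1
                           WHEN 'C' THEN 2
                           ELSE 0
                       END
        WHERE StatusId IS NULL;
    END
    ```

    This avoids extra bookkeeping, at the cost of needing a reliable "not yet patched" condition in the data.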
  • I have a similar need, but I am starting from nothing.

    Currently our service operations team creates ad hoc SQL scripts to fix issues discovered in different areas of different applications.

    They submit these to the DBA team for execution. Originally it was thought the DBA team would review them and execute them, but that is simply not the case: beyond syntax checks, I cannot review the logic or data changes, as I am not familiar with all the products and how their databases work.

    So my management wants to get me out of the middle of this, as I am not serving any purpose except that I have the permissions to execute.

    I am trying to find a way to implement a process where the service team can create and test their scripts in DEV, and then a system that will execute them in production once they are validated and approved.

    I know it is not a true build-and-deployment scenario, but I am looking for something that would help automate this without building my own solution.

    Any suggestions would be great.