Is SQL Source Control able to manage static data in tables with ~200k rows?

Hi there,

I need a tool to manage static data in a database. We have some very big tables with around 200k rows. 
When we develop features in a feature branch, we want to commit the database changes so that we can merge them into the dev branch after review.
Is SQL Source Control the right tool for us, or is the amount of data too big?

Best Regards

Answers

  • Hi @Danr,

    We don't recommend SQL Source Control for large static data tables. At the heart of the challenge is that data comparison gets slower as the tables you are comparing get larger and wider. This can cause the plugin to use a lot of CPU and memory, and the user to wait longer for the comparison to complete.

    For larger static data tables, SQL Change Automation has a "seed data" option which can work well in many scenarios. SQL Change Automation is available as part of the SQL Toolbelt. 

    If you don't have the SQL Toolbelt, you could implement a similar methodology to the seed data option with Flyway if desired.
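
    As a rough illustration of the seed-data approach (not Redgate's implementation, just a sketch of the methodology): a Flyway versioned migration can upsert a baseline set of static rows with a `MERGE`, so re-running the comparison against the seed is the migration itself. Table and column names here are hypothetical.

    ```sql
    -- V2__seed_carparts.sql  (hypothetical Flyway versioned migration)
    -- Upsert the baseline static data; rows absent from the seed are deleted.
    MERGE dbo.CarParts AS target
    USING (VALUES
        (1, 'Spark plug', 4.99),
        (2, 'Oil filter', 12.50)
    ) AS seed (PartId, PartName, Price)
    ON target.PartId = seed.PartId
    WHEN MATCHED AND (target.PartName <> seed.PartName
                      OR target.Price <> seed.Price)
        THEN UPDATE SET PartName = seed.PartName, Price = seed.Price
    WHEN NOT MATCHED BY TARGET
        THEN INSERT (PartId, PartName, Price)
             VALUES (seed.PartId, seed.PartName, seed.Price)
    WHEN NOT MATCHED BY SOURCE
        THEN DELETE;
    ```

    In practice the `VALUES` list would be generated from a CSV file kept in version control, so each branch edits the CSV rather than the table directly.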

    Hope this helps,
    Kendra
    -------------------------------
    @Kendra_Little
    Product Manager at Redgate
  • Danr Posts: 5 New member
    @Kendra_Little So if I understand that correctly, seed data means that we have something like an initial state for the static data (provided by a CSV file, for example), and after that SQL Source Control can easily handle changes of 500-1000 rows per branch? But what happens when some of the initial data is later changed (deleted or updated)? Does that also work?
  • Hi Dan,

    To clarify: the two products I mention are alternatives to SQL Source Control. SQL Source Control doesn't have the functionality for managing data in this way.

    Hope this helps,
    Kendra
    -------------------------------
    @Kendra_Little
    Product Manager at Redgate
  • Danr Posts: 5 New member
    @Kendra_Little
    Thanks for your update.

    Let's assume we changed our database so that tables with > 100k rows were merged, converted to JSON, and attached to about 5,000 parent records.
    Would this perform better, or would Source Control just need more time to compare the JSON strings, ending up with the same result?

    For example, we have a table cars (5k records) and a table carparts (> 100k records). We could describe the carparts of each car as a JSON document and add that as a column in the cars table.
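    On SQL Server, the restructuring Dan describes could be sketched as a one-off migration using `FOR JSON PATH`; the column and key names below are hypothetical.

    ```sql
    -- Hypothetical migration: collapse each car's parts into one JSON column.
    ALTER TABLE dbo.Cars ADD Parts nvarchar(max) NULL;
    GO
    -- Correlated subquery emits one JSON array per car, e.g.
    -- [{"PartId":1,"PartName":"Spark plug","Price":4.99}, ...]
    UPDATE c
    SET Parts = (
        SELECT p.PartId, p.PartName, p.Price
        FROM dbo.CarParts AS p
        WHERE p.CarId = c.CarId
        FOR JSON PATH
    )
    FROM dbo.Cars AS c;
    ```

    Note that a comparison tool would still have to diff the full JSON string for a car whenever any of its parts changed, so this trades row count for wider, more volatile values.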