
Applying schema changes is very slow - affected by large static table?

I've been working on getting our databases up and running in a local environment, and I've run into an issue where it takes a really long time (30+ minutes) to apply changes to a dozen or so stored procedures. While it's applying the changes, it doesn't seem to be using much in the way of system resources. I'm using SQL Source Control 6 for this.

On another question, I saw something about large amounts of static data causing commits to take a long time. Would a large static table (the generated script is 275 MB) cause it to take a long time to apply changes to other objects, such as stored procedures?

Best Answer

    AlexYates Posts: 264 Rose Gold 2
    edited May 3, 2018 9:16AM Answer ✓
    Yes - that could well be the issue. As a rule of thumb, if the static data table is over 1,000 rows, expect an impact on performance. If the table is an order of magnitude bigger, consider using a different strategy.
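
    If you're not sure how big your static data tables actually are, a quick row count against the system catalog will tell you (just a sketch - run it in the dev database):

        -- List user tables with more than 1,000 rows (heap or clustered index only)
        SELECT t.name AS table_name,
               SUM(p.rows) AS row_count
        FROM sys.tables AS t
        JOIN sys.partitions AS p
            ON p.object_id = t.object_id
           AND p.index_id IN (0, 1)
        GROUP BY t.name
        HAVING SUM(p.rows) > 1000
        ORDER BY row_count DESC;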

    However, if this is the issue there is a trick you can use to give you a significant performance boost:

    Setup tab > under Options just for this database, disable checks for changes to static data.

    The source code will still include the static data, but you've turned the comparison off by default, so the static data will stop slowing down the refresh on the Commit and Get Latest tabs. Crucially, however, SQL Source Control will no longer notify you if the data changes. You will need to head back to the Setup tab and flip the check back on if/when you want to commit or pull down data updates.

    Hence, this fix will boost performance, but it will mean your team needs to communicate any static data updates to each other and manually push them up or pull them down.

    Also, this setting is individual to each dev machine. Hence, if using the dedicated model, each developer will individually need to flip the check back on, pull down the data, and flip the check back off again to get their performance back.
    Alex Yates
    DevOps Mentor and Coach

    Director of DLM Consultants
    Creator of Speaking Mentors
    Microsoft Data Platform MVP
    Friend of Redgate
    Twitter / LinkedIn

Answers

    juliangoldsmith Posts: 2 New member
    Thanks, Alex.

    After doing a bit of digging, I found out that my issue was completely unrelated to the static data. My problem only happened with stored procs, and only with certain ones.

    It turns out that some of our stored procs use a linked server to access the very database they're running on (presumably due to the magic of copy and paste). The issue only happens when I try to update more than one stored proc that uses the linked server.
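
    To give an idea of what I mean (the server, database, and object names here are invented), the offending procs looked roughly like this, using a four-part name through a linked server that points back at the same instance:

        -- LOOPBACK is a linked server defined to point back at the local instance
        CREATE PROCEDURE dbo.GetCustomerOrders
            @CustomerId INT
        AS
        BEGIN
            SELECT o.OrderId, o.OrderDate
            FROM LOOPBACK.OurDatabase.dbo.Orders AS o  -- loopback four-part name
            WHERE o.CustomerId = @CustomerId;
        END;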

    What I think was happening is that when SQL Source Control tried to update the procs, it locked the sys tables that handle object lookups as part of a transaction. Later in the same transaction, SQL Server checks that all the objects referenced in the stored procedure exist, which means looking at the sys tables on the linked server. Because that check goes through the linked server, SQL Server requests a new lock on those sys tables rather than reusing the one it already holds. That, of course, deadlocks against the lock SQL Source Control already has.
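
    If anyone wants to see this for themselves, you can watch the blocking while the update hangs using the standard DMVs (nothing specific to SQL Source Control):

        -- Requests that are currently blocked, and what they're waiting on
        SELECT r.session_id,
               r.blocking_session_id,
               r.wait_type,
               r.wait_resource,
               r.command
        FROM sys.dm_exec_requests AS r
        WHERE r.blocking_session_id <> 0;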

    My solution is going to be to simply remove any loopback linked server references from my stored procedures. They're pretty pointless and make the queries harder to read anyway.
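
    The rewrite itself is trivial - the four-part names just become local references (same invented names as above):

        -- Same proc with the loopback reference removed
        ALTER PROCEDURE dbo.GetCustomerOrders
            @CustomerId INT
        AS
        BEGIN
            SELECT o.OrderId, o.OrderDate
            FROM dbo.Orders AS o  -- plain local reference, same database
            WHERE o.CustomerId = @CustomerId;
        END;
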
    AlexYates Posts: 264 Rose Gold 2
    Glad you figured it out. That sounds like a good plan!
    Alex Yates
    DevOps Mentor and Coach

    Director of DLM Consultants
    Creator of Speaking Mentors
    Microsoft Data Platform MVP
    Friend of Redgate
    Twitter / LinkedIn