General performance

mioconnor Posts: 4
Hi,

We have SQL Source Control applied to a fairly large database (200 procs, 100 tables, etc.).

I have 6 developers working on a central copy, but the performance is extremely slow when committing changes if more than one person tries to do this at the same time.

One other issue is with metadata. We have a huge amount of it, 27,000+ rows in one table. Unfortunately, SQL Source Control times out when analysing this data for check-in.

Any tips, or updates soon to be available? This is unfortunately making it unusable at the moment.


Many thanks,

Mike

Comments

  • James B Posts: 1,124 Silver 4
    Thanks for your post, and I'm sorry you're having trouble with performance.

    The number of objects doesn't sound too bad; at, say, fewer than 1,000 objects, performance should be pretty quick. Is everything local (i.e. the database and the repository)? In tests, a DB with around 8,500 objects would normally refresh the commit tab in under a minute. What kind of times are you seeing?

    27,000 data rows may cause you more of a performance problem. We're improving this, but large amounts of data still cause issues; really, the feature is designed for smaller amounts of static data.
    Systems Software Engineer

    Redgate Software

  • The database is not local but the repository is. Refresh times seem to be much longer when more than one person refreshes at once. If there are no changes to commit, it refreshes in about 4 seconds, which is no problem. With more than one person refreshing, that rises to 30 seconds or more, and if there are objects to commit it runs into minutes. The more changes there are, and the more people refreshing, the longer it takes.

    One easy win would be to stop refreshing the tab every time the Commit tab is clicked. I've often been frustrated when I move to another tab to check something, only for it to refresh again when I move back. Let the user trigger the refresh manually; this would reduce the conflicts.

    Regarding the data rows, when do you think you will have a release with this improvement? I understand that 27,000 rows is a lot, but it would be amazing if you could increase the timeout, maybe with a warning or something. Having it literally break with a .NET timeout is not great and makes it impossible to use. Increase the timeout and at least we can let it work in the background...

    Thoughts?
  • James B Posts: 1,124 Silver 4
    The current (or possibly next) build should include some data performance improvements. In addition, the tabs will perform background refreshes and notify you with a popup when the refresh is complete, which may help.
    Which version are you currently running?
    Systems Software Engineer

    Redgate Software

  • Thanks for the reply James.

    We are using 3.0.11.


    Cheers,

    Mike
  • James B Posts: 1,124 Silver 4
    We've released 3.0.12 now, which adds a faster method for data comparison that may help you.
    This option is not enabled by default as it's a new feature. To turn it on, perform the following steps:


    1. Close SSMS.
    2. Open the following file - C:\Users\[username]\AppData\Local\Red Gate\SQL Source Control 3\RedGate_SQLSourceControl_Engine_EngineOptions.xml
    3. Add the following EnableFastDataParse tag inside the EngineOptions tag:
    <?xml version="1.0" encoding="utf-16" standalone="yes"?>
    <!---->
    <EngineOptions version="2" type="EngineOptions">
    <EnableFastDataParse>true</EnableFastDataParse>
    </EngineOptions>
    4. Save the file and restart SSMS.
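    For anyone rolling this setting out to several developers' machines, the manual edit above could be scripted. This is just a sketch (not an official Redgate tool) using Python's standard-library XML module; the file path and tag names are taken from the steps above, and it assumes SSMS is closed when it runs:

    ```python
    # Sketch: add <EnableFastDataParse>true</EnableFastDataParse> inside the
    # <EngineOptions> element of the engine options file, as in the steps above.
    # Run with SSMS closed. Path and tag names follow the forum instructions.
    import xml.etree.ElementTree as ET

    def enable_fast_data_parse(path):
        tree = ET.parse(path)
        root = tree.getroot()  # the <EngineOptions> element
        tag = root.find("EnableFastDataParse")
        if tag is None:
            # Tag not present yet - create it inside <EngineOptions>
            tag = ET.SubElement(root, "EnableFastDataParse")
        tag.text = "true"
        # The original file is UTF-16 with an XML declaration, so write it back the same way
        tree.write(path, encoding="utf-16", xml_declaration=True)
    ```

    You would call it with the per-user path, e.g. `enable_fast_data_parse(r"C:\Users\mike\AppData\Local\Red Gate\SQL Source Control 3\RedGate_SQLSourceControl_Engine_EngineOptions.xml")`. Note this rewrites the file with ElementTree's own formatting, so any comments in the file are dropped.
    
    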
    Systems Software Engineer

    Redgate Software
