
Optimisation for large data synchronisation?

ChrisLuv Posts: 3
Hi,

I am synchronising a couple of quite large tables - 2 million rows.

Looking at how the INSERT statements are being run through the software, it looks like KEY locks are being generated, which is obviously sub-optimal when so many rows are being updated. Is there any way to automatically upgrade these to TABLE locks, or to alter the isolation level, without altering the SQL code?
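To illustrate what I mean, this is roughly the shape I would want the generated statements to take - a table-lock hint up front rather than millions of row-level KEY locks (the table and column names here are just made-up examples):

    -- Illustrative sketch only: an INSERT that asks for a table lock,
    -- so SQL Server does not take a KEY lock per row.
    -- dbo.BigTable, Id and Payload are invented names.
    INSERT INTO dbo.BigTable WITH (TABLOCK) (Id, Payload)
    VALUES (1, 'example row')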

Are there any other ways to speed up the sync? It has currently been running for over 24 hours, and it is trying to run a 4 GB SQL script.

NB: This is the first time we are trying a sync, so I hope future updates will be a lot smaller.

Chris

Comments

  • Hi Chris,

    Currently the only option you can change is whether or not to use transactions. We can look at improving this in a future version if possible.
  • Thanks Brian,

    I finally got my update finished; I had to quit and manually upload the large tables. I then took the SQL script generated by Data Compare, split it up into "manageable" chunks, and ran them in Query Analyzer - this seemed to work best (see the sketch at the end of the thread).

    In spite of these teething problems I love the product - it's saved me days of work. Good work, guys!

    Chris
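For anyone finding this later, the chunking Chris describes amounts to breaking the generated script into separate batches. In Query Analyzer (or osql) a GO separator ends a batch, so the server parses and executes the script a chunk at a time rather than as one enormous block. A minimal sketch, with invented table and data:

    -- Minimal sketch of the chunked-script idea; the table and rows
    -- are invented. Each GO ends a batch, so the script is parsed
    -- and executed a chunk at a time.
    CREATE TABLE dbo.BigTable (Id int PRIMARY KEY, Payload varchar(50))
    GO

    -- chunk 1
    INSERT INTO dbo.BigTable (Id, Payload) VALUES (1, 'row 1')
    INSERT INTO dbo.BigTable (Id, Payload) VALUES (2, 'row 2')
    GO

    -- chunk 2
    INSERT INTO dbo.BigTable (Id, Payload) VALUES (3, 'row 3')
    INSERT INTO dbo.BigTable (Id, Payload) VALUES (4, 'row 4')
    GO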