
Alternative ways to compare a 1 TB database?

We are migrating a 1 TB on-prem SQL Server DB to Azure SQL DB, and in the case of a "no-go" decision 1-5 days later, the new data has to be transferred back from Azure to on-prem again.
There is no ready-made way to do this: no backup/restore, no fast bacpac route, no reliable SQL replication, etc.

Data Compare has not succeeded yet with this large DB, and I understand it may not be a fit for this scenario on its own. But I will try to use WHERE filters. I will prepare Id tables from the changed DB, collecting the Ids of all new/changed/deleted rows from a certain date, and then sync those rows back to the source DB with RDC (see the sketch below). So far so good.
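
A minimal sketch of how such an Id table could be built, assuming the table being compared has a last-modified timestamp column; dbo.Orders, LastModified, IdCol1, IdCol2 and the cut-over date are placeholder names, not the actual schema. Deleted rows cannot be caught this way without change tracking or an audit/trigger table, so they would need a separate source of Ids.

    -- Hypothetical example: collect the keys of rows touched since the migration cut-over.
    -- dbo.Orders, LastModified, IdCol1 and IdCol2 are placeholder names.
    IF OBJECT_ID('dbo.IdTable1') IS NOT NULL
        DROP TABLE dbo.IdTable1;

    SELECT o.IdCol1, o.IdCol2
    INTO dbo.IdTable1
    FROM dbo.Orders AS o
    WHERE o.LastModified >= '20240115';   -- placeholder cut-over date

    -- An index on the key columns helps the EXISTS filter used in the comparison.
    CREATE CLUSTERED INDEX IX_IdTable1_Keys ON dbo.IdTable1 (IdCol1, IdCol2);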

Then I do the actual Data Compare. I will use the same WHERE filter for both source and target. It will look like:
"[where] exists (select 1 from IdTable1 x where x.IdCol1 = IdCol1 and x.IdCol2 = IdCol2)"
or, if the Id table has just one column, maybe "[where] IdCol1 IN (select IdCol1 from IdTable2)". Both forms are written out in full below.
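
For reference, here are the two filter forms written out, plus a quick way to test them; as far as I know the row-filter field takes just the predicate (no WHERE keyword), and unqualified columns refer to the table being compared. Table and column names are the placeholders from above.

    -- Two-column key: predicate as it would go into the row filter.
    EXISTS (SELECT 1 FROM dbo.IdTable1 x
            WHERE x.IdCol1 = IdCol1 AND x.IdCol2 = IdCol2)

    -- Single-column key:
    IdCol1 IN (SELECT IdCol1 FROM dbo.IdTable2)

    -- Quick sanity check of how many rows the filter will select,
    -- run directly against the table being compared (alias o):
    SELECT COUNT(*)
    FROM dbo.Orders AS o
    WHERE EXISTS (SELECT 1 FROM dbo.IdTable1 x
                  WHERE x.IdCol1 = o.IdCol1 AND x.IdCol2 = o.IdCol2);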

What do you think, can the where-filter be optimized? Is there a better way to solve this with RDC?

Answers

  • hakancab
    Now testing, I see that this works. Instead of 12 000 000 rows I select 150 000 rows in a certain table, and the compare time was reduced from 90 min to 3 min. :-)
  • hakancab
    How big a change script can Data Compare handle?
    I got a 136 GB script, but that covered three weeks of changes, so with just one week of changes it may be around 45 GB.