Data Compare running out of memory

I'm on Windows 10 with 6 GB of RAM and about 150 GB of storage available.
I am having trouble updating large tables between two SQL Server 2012 databases.
Data Compare is failing with out-of-memory error messages (these have been sent to Redgate).
I've split the run of one database into several smaller runs to limit the row count, and even then I've found it's necessary to reboot between runs if a table of any size is being copied.
My machine runs at 99% memory usage during these runs.
I don't think I'm running out of hard drive space for this - just memory.
The tables aren't that big. The largest one is 2,800,000 records (I run it by itself).
Most of the rest are under 100,000. The larger tables (over 300,000 rows) seem to be the issue.

Also, I don't see any recommendation for how much memory is needed. Would adding more memory help?

Redgate Data Compare version:
12.0.28.3138 Professional

Comments

  • I don't know the specifics of how much memory Data Compare needs (it probably varies significantly with the type of data in the rows as well as the number of rows being compared), but that sounds like something that should work.

    The first thing to look into would be updating to the latest version of Data Compare - we've done a lot of bug fixing since 12.0.28, and I know in particular we've improved memory usage in the deployment wizard, so if that's where you're having trouble it will probably help a lot.

    If that doesn't help (or if you're having memory problems in other parts of the UI, such as during the comparison or when viewing results), then we can look into the problem in more detail.
  • I am having the same problem. I have tables with 2 million records that take forever to compare and crash frequently, even when updating just one table. Tables with fewer than 100,000 records update fine as long as I don't try to update more than one table at a time. :(
    I have installed the most current version, 13.
  • KD1KD1: I did upgrade to what was then current, Redgate 12.33.4490. This seemed to solve my issue, though the largest table I've copied on the newer version was only 1.2 million rows. It copied quite quickly.
  • Hi @JeanDorsey,

    Have you reviewed the options for comparison behavior? For example: filter the data (compare only the data you are really interested in, which also reduces the size of the temporary files), clear the Show identical values in results check box, and, if you are comparing tables with large amounts of data that change infrequently, select Use checksum comparison.

    Are the databases being compared on the same computer as SQL Data Compare? If not, I suggest installing SQL Data Compare on the same machine.

    You may find this page useful:
    https://documentation.red-gate.com/sdc13/troubleshooting/error-messages/troubleshooting-system-outofmemoryexception-during-comparison
    Kind regards

    Tianjiao Li | Redgate Software
    Have you visited our Help Center?
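For anyone curious what the Use checksum comparison option mentioned above is getting at, here is a minimal, hypothetical sketch of the general idea: hash each row so two tables can be diffed by key while holding only small digests in memory, rather than full row contents for both sides. This is not Redgate's implementation; the table and column names are made up, and sqlite3 stands in for SQL Server purely so the sketch is runnable.

```python
import hashlib
import sqlite3

def row_hashes(conn, table, key_col, cols):
    """Yield (key, digest) pairs for each row, so a comparison only
    needs to keep a small digest per row rather than the full row."""
    cur = conn.execute(
        f"SELECT {key_col}, {', '.join(cols)} FROM {table} ORDER BY {key_col}"
    )
    for row in cur:
        digest = hashlib.sha256(repr(row[1:]).encode()).hexdigest()
        yield row[0], digest

def differing_keys(conn_a, conn_b, table, key_col, cols):
    """Return keys whose rows differ (or exist on only one side)."""
    a = dict(row_hashes(conn_a, table, key_col, cols))
    b = dict(row_hashes(conn_b, table, key_col, cols))
    return sorted(k for k in set(a) | set(b) if a.get(k) != b.get(k))

# Demo: two in-memory databases standing in for the two servers.
src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn in (src, dst):
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
dst.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "x")])
print(differing_keys(src, dst, "t", "id", ["val"]))  # → [2, 3]
```

The memory saving comes from never storing two full copies of the row data: row 1 matches and is dropped as soon as its digests compare equal, while rows 2 and 3 are flagged for a follow-up fetch of their actual contents.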