
insufficient system memory error during synchronisation

ChrisClark Posts: 7

I'm using SQL Data Compare to produce a 10 MB synchronisation script. When I attempt to apply this script using SQL Server Management Studio or SQLCMD, it runs for approximately 20 minutes and then fails with the error 'There is insufficient system memory to run this query' or 'There is insufficient memory available in the buffer pool.' The server I'm running this against has 1.25 GB of memory and a 3 GB swap file. The last time I ran it, I watched Task Manager for the duration of the process and the page file never went over 2 GB, yet the script still failed.

I suspect this problem is caused by the fact that the generated script is a single batch. From other posts on this forum it seems that SQL Data Compare cannot be instructed to split the script into several batches, and I'm wary of writing code to do this automatically, from a data-integrity point of view. Could anyone suggest anything to make the script more memory-efficient? I'm using the command line to generate the script (based on saved SDC packages), and perhaps I'm using the wrong switches?


  • Eddie D Posts: 1,792 Rose Gold 5
    Hi Chris

    Thank you for your post.

    You may want to try using a 'WHERE' filter to split the tables and generate smaller comparison and synchronization scripts. Using the GUI, edit your SQL Data Compare project. On the 'Tables & Views' tab, right-click the tables that contain large amounts of data and choose 'Open WHERE clause editor'. This opens a dialog box where you can specify a WHERE clause and split a large table into parts.

    Best Regards
    Eddie Davis
    Senior Product Support Engineer
    Redgate Software Ltd
    Email: support@red-gate.com
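
    To illustrate the kind of filter the WHERE clause editor accepts, here is a hedged sketch assuming a hypothetical numeric key column [tableID]; each range would be saved as its own project and run separately.

```sql
-- Slice 1 WHERE clause ([tableID] is an assumed key column):
[tableID] BETWEEN 1 AND 500000
-- Slice 2 WHERE clause:
[tableID] BETWEEN 500001 AND 1000000
```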
    Hi Eddie,

    Thanks for your reply. I'm currently using a WHERE clause to synchronise a very specific subset of the data. The only other thing I could do is use a WHERE clause to horizontally partition the data and generate a separate script for each partition; ideally I'd like to avoid this, as it means I'd have to manually create a transaction to wrap the scripts in.
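
    For what it's worth, splitting a script into GO-separated batches does not by itself preclude a single transaction: a transaction opened in one batch stays open across subsequent batches on the same connection. A minimal sketch, with illustrative table and column names rather than anything from the actual script:

```sql
SET XACT_ABORT ON;  -- a run-time error aborts the batch and rolls back the transaction
BEGIN TRANSACTION;
GO
-- first per-table batch (illustrative statement)
UPDATE [dbo].[TableA] SET [field1] = 1 WHERE [tableID] = 42;
GO
-- second per-table batch (illustrative statement)
DELETE FROM [dbo].[TableB] WHERE [dbaIsDeleted] = 1;
GO
IF @@TRANCOUNT > 0 COMMIT TRANSACTION;
GO
```

    When running the file through SQLCMD, the -b switch makes it stop at the first error, so the open transaction is simply rolled back when the connection closes.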


    Further to this, I've just run the same script through Query Analyzer, having manually edited it to add GO after each set of table updates - the whole thing completed in 1 minute 10 seconds! I'd say that's fairly conclusive, but I can't see why SQL Server 2005 is unable to handle a 10 MB batch. Does anyone have any suggestions?
    So the approach here is to use the WHERE clause to limit the row count, which forces the user to manually change the WHERE clause X number of times to compare and synchronize the entire recordset.

    This seems to be a conflict with automation. The approach should be to let the user specify a SET ROWCOUNT value Y (e.g. 10000); Red-Gate Data Compare would then generate multiple INSERT, UPDATE, or DELETE scripts. Likewise, Schema Compare could populate a rebuilt table like this, with no BEGIN/COMMIT TRANSACTION, assuming a backup is available and disk-space optimization is preferred:

    SET ROWCOUNT 10000
    WHILE 0 = 0 BEGIN
        -- populate the rebuilt table, 10,000 rows per iteration
        INSERT INTO [dbo].[tmpxxx_table] ([tableID], [field1], [dbaDateCreated], [dbaDateUpdated], [dbaIsActive], [dbaIsDeleted])
        SELECT s.[tableID], s.[field1], s.[dbaDateCreated], s.[dbaDateUpdated], s.[dbaIsActive], s.[dbaIsDeleted]
        FROM [dbo].[source_table] AS s  -- source table name truncated in the original post
        WHERE NOT EXISTS (SELECT 1 FROM [dbo].[tmpxxx_table] t WHERE t.[tableID] = s.[tableID])
        IF @@ROWCOUNT = 0 BREAK  -- stop once every row has been copied
    END
    SET ROWCOUNT 0
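
    One caveat on the SET ROWCOUNT pattern: it is deprecated for INSERT, UPDATE, and DELETE statements, so a forward-compatible version of the same batching loop would use TOP instead. A sketch under the same assumptions ([dbo].[source_table] stands in for the table name truncated above, and [tableID] is assumed to be the key):

```sql
WHILE 1 = 1
BEGIN
    -- copy the next 10,000 not-yet-copied rows
    INSERT INTO [dbo].[tmpxxx_table] ([tableID], [field1])
    SELECT TOP (10000) s.[tableID], s.[field1]
    FROM [dbo].[source_table] AS s
    WHERE NOT EXISTS (SELECT 1 FROM [dbo].[tmpxxx_table] AS t
                      WHERE t.[tableID] = s.[tableID]);
    IF @@ROWCOUNT < 10000 BREAK;  -- the last, partial batch has been copied
END
```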